Oct 03 14:42:55 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 03 14:42:55 crc restorecon[4682]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 14:42:55 crc restorecon[4682]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 03 14:42:55 crc restorecon[4682]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc 
restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 14:42:55 crc restorecon[4682]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 14:42:55 crc restorecon[4682]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 14:42:55 crc restorecon[4682]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 14:42:55 crc 
restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 
14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 14:42:55 crc restorecon[4682]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 14:42:55 crc 
restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 14:42:55 crc restorecon[4682]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 14:42:55 crc restorecon[4682]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 14:42:55 crc restorecon[4682]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 14:42:55 crc 
restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 14:42:55 crc restorecon[4682]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:55 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 
14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 14:42:56 crc 
restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc 
restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:42:56 crc restorecon[4682]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc 
restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc 
restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc 
restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc 
restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:42:56 crc restorecon[4682]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:42:56 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:42:57 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:42:57 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:42:57 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:42:57 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:42:57 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:42:57 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:42:57 crc 
restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:42:57 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:42:57 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:42:57 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:42:57 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:42:57 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:42:57 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:42:57 crc restorecon[4682]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:42:57 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:42:57 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 03 14:42:57 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:42:57 crc restorecon[4682]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 14:42:57 crc restorecon[4682]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 14:42:57 crc restorecon[4682]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 14:42:57 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 14:42:57 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 14:42:57 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 14:42:57 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 03 14:42:57 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 14:42:57 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 14:42:57 crc restorecon[4682]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 14:42:57 crc restorecon[4682]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 03 14:42:58 crc kubenswrapper[4774]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 03 14:42:58 crc kubenswrapper[4774]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 03 14:42:58 crc kubenswrapper[4774]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 03 14:42:58 crc kubenswrapper[4774]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 03 14:42:58 crc kubenswrapper[4774]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 03 14:42:58 crc kubenswrapper[4774]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.723094 4774 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.735832 4774 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.735896 4774 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.735908 4774 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.735919 4774 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.735928 4774 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.735937 4774 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.735946 4774 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.735984 4774 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.735996 4774 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736006 4774 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736015 4774 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736025 4774 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736039 4774 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736052 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736063 4774 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736073 4774 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736084 4774 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736094 4774 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736105 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736115 4774 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736124 4774 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736133 4774 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736141 4774 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736151 4774 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736160 4774 feature_gate.go:330] unrecognized feature gate: Example Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736169 4774 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736179 4774 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736188 4774 feature_gate.go:330] unrecognized 
feature gate: PersistentIPsForVirtualization Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736197 4774 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736207 4774 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736216 4774 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736226 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736234 4774 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736243 4774 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736252 4774 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736262 4774 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736277 4774 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736288 4774 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736297 4774 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736306 4774 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736315 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736325 4774 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736333 4774 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736357 4774 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736366 4774 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736410 4774 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736420 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736429 4774 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736437 4774 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736446 4774 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736455 4774 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736465 4774 feature_gate.go:330] 
unrecognized feature gate: DNSNameResolver Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736475 4774 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736484 4774 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736494 4774 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736504 4774 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736515 4774 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736525 4774 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736535 4774 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736559 4774 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736569 4774 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736577 4774 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736584 4774 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736595 4774 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736605 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736615 4774 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736626 4774 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736635 4774 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736648 4774 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736658 4774 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.736668 4774 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740050 4774 flags.go:64] FLAG: --address="0.0.0.0" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740080 4774 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740094 4774 flags.go:64] FLAG: --anonymous-auth="true" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740100 4774 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740108 4774 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740113 4774 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740121 4774 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740128 4774 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740142 
4774 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740148 4774 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740154 4774 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740160 4774 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740165 4774 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740170 4774 flags.go:64] FLAG: --cgroup-root="" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740176 4774 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740181 4774 flags.go:64] FLAG: --client-ca-file="" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740185 4774 flags.go:64] FLAG: --cloud-config="" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740190 4774 flags.go:64] FLAG: --cloud-provider="" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740194 4774 flags.go:64] FLAG: --cluster-dns="[]" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740205 4774 flags.go:64] FLAG: --cluster-domain="" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740210 4774 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740215 4774 flags.go:64] FLAG: --config-dir="" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740220 4774 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740226 4774 flags.go:64] FLAG: --container-log-max-files="5" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740233 4774 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 
14:42:58.740238 4774 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740243 4774 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740248 4774 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740252 4774 flags.go:64] FLAG: --contention-profiling="false" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740257 4774 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740262 4774 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740266 4774 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740271 4774 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740278 4774 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740283 4774 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740287 4774 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740293 4774 flags.go:64] FLAG: --enable-load-reader="false" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740298 4774 flags.go:64] FLAG: --enable-server="true" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740303 4774 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740312 4774 flags.go:64] FLAG: --event-burst="100" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740316 4774 flags.go:64] FLAG: --event-qps="50" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740321 4774 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 03 
14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740325 4774 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740329 4774 flags.go:64] FLAG: --eviction-hard="" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740340 4774 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740344 4774 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740349 4774 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740354 4774 flags.go:64] FLAG: --eviction-soft="" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740359 4774 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740364 4774 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740374 4774 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740381 4774 flags.go:64] FLAG: --experimental-mounter-path="" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740401 4774 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740405 4774 flags.go:64] FLAG: --fail-swap-on="true" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740410 4774 flags.go:64] FLAG: --feature-gates="" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740416 4774 flags.go:64] FLAG: --file-check-frequency="20s" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740420 4774 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740425 4774 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740431 4774 flags.go:64] FLAG: 
--healthz-bind-address="127.0.0.1" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740436 4774 flags.go:64] FLAG: --healthz-port="10248" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740441 4774 flags.go:64] FLAG: --help="false" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740446 4774 flags.go:64] FLAG: --hostname-override="" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740450 4774 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740455 4774 flags.go:64] FLAG: --http-check-frequency="20s" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740460 4774 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740464 4774 flags.go:64] FLAG: --image-credential-provider-config="" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740470 4774 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740474 4774 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740479 4774 flags.go:64] FLAG: --image-service-endpoint="" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740483 4774 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740487 4774 flags.go:64] FLAG: --kube-api-burst="100" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740492 4774 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740497 4774 flags.go:64] FLAG: --kube-api-qps="50" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740502 4774 flags.go:64] FLAG: --kube-reserved="" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740507 4774 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740512 4774 flags.go:64] FLAG: 
--kubeconfig="/var/lib/kubelet/kubeconfig" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740517 4774 flags.go:64] FLAG: --kubelet-cgroups="" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740521 4774 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740526 4774 flags.go:64] FLAG: --lock-file="" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740531 4774 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740547 4774 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740552 4774 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740560 4774 flags.go:64] FLAG: --log-json-split-stream="false" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740564 4774 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740569 4774 flags.go:64] FLAG: --log-text-split-stream="false" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740573 4774 flags.go:64] FLAG: --logging-format="text" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740578 4774 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740583 4774 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740588 4774 flags.go:64] FLAG: --manifest-url="" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740593 4774 flags.go:64] FLAG: --manifest-url-header="" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740600 4774 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740605 4774 flags.go:64] FLAG: --max-open-files="1000000" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740611 4774 
flags.go:64] FLAG: --max-pods="110" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740616 4774 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740620 4774 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740625 4774 flags.go:64] FLAG: --memory-manager-policy="None" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740629 4774 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740634 4774 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740640 4774 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740645 4774 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740658 4774 flags.go:64] FLAG: --node-status-max-images="50" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740663 4774 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740669 4774 flags.go:64] FLAG: --oom-score-adj="-999" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740674 4774 flags.go:64] FLAG: --pod-cidr="" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740678 4774 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740687 4774 flags.go:64] FLAG: --pod-manifest-path="" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740691 4774 flags.go:64] FLAG: --pod-max-pids="-1" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740695 4774 flags.go:64] FLAG: --pods-per-core="0" Oct 03 14:42:58 
crc kubenswrapper[4774]: I1003 14:42:58.740700 4774 flags.go:64] FLAG: --port="10250" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740704 4774 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740709 4774 flags.go:64] FLAG: --provider-id="" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740713 4774 flags.go:64] FLAG: --qos-reserved="" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740718 4774 flags.go:64] FLAG: --read-only-port="10255" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740723 4774 flags.go:64] FLAG: --register-node="true" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740728 4774 flags.go:64] FLAG: --register-schedulable="true" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740733 4774 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740750 4774 flags.go:64] FLAG: --registry-burst="10" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740755 4774 flags.go:64] FLAG: --registry-qps="5" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740759 4774 flags.go:64] FLAG: --reserved-cpus="" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740764 4774 flags.go:64] FLAG: --reserved-memory="" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740769 4774 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740774 4774 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740778 4774 flags.go:64] FLAG: --rotate-certificates="false" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740783 4774 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740787 4774 flags.go:64] FLAG: --runonce="false" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740792 4774 
flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740804 4774 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740809 4774 flags.go:64] FLAG: --seccomp-default="false" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740814 4774 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740819 4774 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740825 4774 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740832 4774 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740837 4774 flags.go:64] FLAG: --storage-driver-password="root" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740842 4774 flags.go:64] FLAG: --storage-driver-secure="false" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740847 4774 flags.go:64] FLAG: --storage-driver-table="stats" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740852 4774 flags.go:64] FLAG: --storage-driver-user="root" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740856 4774 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740861 4774 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740865 4774 flags.go:64] FLAG: --system-cgroups="" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740870 4774 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740878 4774 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740883 4774 flags.go:64] FLAG: --tls-cert-file="" Oct 03 
14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740887 4774 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740898 4774 flags.go:64] FLAG: --tls-min-version="" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740902 4774 flags.go:64] FLAG: --tls-private-key-file="" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740906 4774 flags.go:64] FLAG: --topology-manager-policy="none" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740911 4774 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740915 4774 flags.go:64] FLAG: --topology-manager-scope="container" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740919 4774 flags.go:64] FLAG: --v="2" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740926 4774 flags.go:64] FLAG: --version="false" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740932 4774 flags.go:64] FLAG: --vmodule="" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740938 4774 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.740949 4774 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741121 4774 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741127 4774 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741133 4774 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741138 4774 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741143 4774 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741148 4774 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741153 4774 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741157 4774 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741162 4774 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741166 4774 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741172 4774 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741176 4774 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741181 4774 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741185 4774 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741189 4774 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741193 4774 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741197 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741201 4774 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741205 4774 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741209 4774 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741213 4774 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741218 4774 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741222 4774 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741226 4774 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741231 4774 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741235 4774 
feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741238 4774 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741242 4774 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741247 4774 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741251 4774 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741256 4774 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741259 4774 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741263 4774 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741267 4774 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741271 4774 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741281 4774 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741285 4774 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741289 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741293 4774 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741297 4774 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 03 14:42:58 crc kubenswrapper[4774]: 
W1003 14:42:58.741302 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741308 4774 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741313 4774 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741319 4774 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741323 4774 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741328 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741332 4774 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741337 4774 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741341 4774 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741346 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741350 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741355 4774 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741360 4774 feature_gate.go:330] unrecognized feature gate: Example Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741366 4774 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741374 4774 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741379 4774 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741398 4774 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741403 4774 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741407 4774 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741411 4774 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741415 4774 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741419 4774 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741423 4774 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741426 4774 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741430 4774 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741434 4774 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741437 4774 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741441 4774 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741445 4774 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741449 4774 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.741453 4774 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.741476 4774 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.778880 4774 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.778922 4774 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779058 4774 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779069 4774 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779079 4774 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779086 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779091 4774 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779095 4774 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779100 4774 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779104 4774 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779108 4774 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779112 4774 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779116 4774 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779120 4774 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779125 4774 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779129 4774 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779133 4774 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779139 4774 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779144 4774 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779148 4774 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779154 4774 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779158 4774 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779163 4774 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779167 4774 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779172 4774 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779176 4774 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779181 4774 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779185 4774 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779190 4774 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779195 4774 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779200 4774 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779211 4774 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779217 4774 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779221 4774 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779226 4774 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779231 4774 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779236 4774 feature_gate.go:330] unrecognized feature gate: Example
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779240 4774 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779244 4774 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779248 4774 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779253 4774 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779257 4774 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779261 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779265 4774 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779269 4774 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779274 4774 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779279 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779284 4774 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779288 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779293 4774 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779297 4774 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779302 4774 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779307 4774 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779311 4774 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779316 4774 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779320 4774 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779324 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779329 4774 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779333 4774 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779338 4774 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779342 4774 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779346 4774 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779350 4774 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779355 4774 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779359 4774 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779363 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779368 4774 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779373 4774 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779381 4774 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779402 4774 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779409 4774 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779414 4774 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779419 4774 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.779427 4774 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779549 4774 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779558 4774 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779563 4774 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779567 4774 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779571 4774 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779575 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779580 4774 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779585 4774 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779591 4774 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779596 4774 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779600 4774 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779606 4774 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779614 4774 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779620 4774 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779625 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779629 4774 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779634 4774 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779640 4774 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779646 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779651 4774 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779655 4774 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779660 4774 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779664 4774 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779668 4774 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779672 4774 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779676 4774 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779681 4774 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779686 4774 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779690 4774 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779695 4774 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779700 4774 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779704 4774 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779708 4774 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779712 4774 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779716 4774 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779720 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779725 4774 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779729 4774 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779733 4774 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779738 4774 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779742 4774 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779746 4774 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779750 4774 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779756 4774 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779763 4774 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779767 4774 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779771 4774 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779775 4774 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779779 4774 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779784 4774 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779789 4774 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779794 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779799 4774 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779805 4774 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779811 4774 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779817 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779822 4774 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779826 4774 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779830 4774 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779834 4774 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779839 4774 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779843 4774 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779847 4774 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779851 4774 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779855 4774 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779859 4774 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779864 4774 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779868 4774 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779872 4774 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779876 4774 feature_gate.go:330] unrecognized feature gate: Example
Oct 03 14:42:58 crc kubenswrapper[4774]: W1003 14:42:58.779882 4774 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.779890 4774 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.786026 4774 server.go:940] "Client rotation is on, will bootstrap in background"
Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.818300 4774 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.818450 4774 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.820733 4774 server.go:997] "Starting client certificate rotation"
Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.820768 4774 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.827318 4774 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-08 07:23:31.23970615 +0000 UTC
Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.827394 4774 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 856h40m32.412315775s for next certificate rotation
Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.915944 4774 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.920478 4774 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 03 14:42:58 crc kubenswrapper[4774]: I1003 14:42:58.981075 4774 log.go:25] "Validated CRI v1 runtime API"
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.076398 4774 log.go:25] "Validated CRI v1 image API"
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.078616 4774 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.100423 4774 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-03-14-38-09-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.100459 4774 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:41 fsType:tmpfs blockSize:0}]
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.117202 4774 manager.go:217] Machine: {Timestamp:2025-10-03 14:42:59.114051581 +0000 UTC m=+1.703255053 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654116352 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464 BootID:1c19104c-b8fa-49cf-91e0-46a9a8f59ee9 Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:41 Capacity:1073741824 Type:vfs Inodes:4108168 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827056128 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:8b:a4:58 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:8b:a4:58 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:0c:48:e3 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:80:6d:d9 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:8b:fc:30 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:75:87:f5 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:2e:2d:25:99:be:81 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:2a:07:91:4c:04:d9 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654116352 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.117525 4774 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.117684 4774 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.118055 4774 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.118237 4774 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.118476 4774 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.118701 4774 topology_manager.go:138] "Creating topology manager with none policy"
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.118713 4774 container_manager_linux.go:303] "Creating device plugin manager"
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.121791 4774 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.122151 4774 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.122318 4774 state_mem.go:36] "Initialized new in-memory state store"
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.122430 4774 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.133607 4774 kubelet.go:418] "Attempting to sync node with API server"
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.133664 4774 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.133696 4774 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.133716 4774 kubelet.go:324] "Adding apiserver pod source"
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.133745 4774 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Oct 03 14:42:59 crc kubenswrapper[4774]: W1003 14:42:59.140883 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused
Oct 03 14:42:59 crc kubenswrapper[4774]: E1003 14:42:59.140963 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError"
Oct 03 14:42:59 crc kubenswrapper[4774]: W1003 14:42:59.145472 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused
Oct 03 14:42:59 crc kubenswrapper[4774]: E1003 14:42:59.145571 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError"
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.159988 4774 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.162299 4774 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.172918 4774 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.174476 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.174505 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.174541 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.174554 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.174569 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.174579 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.174588 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.174603 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.174613 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.174622 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.174642 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.174651 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.178349 4774 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.178890 4774 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.179201 4774 server.go:1280] "Started kubelet" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.179455 4774 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.179634 4774 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.180068 4774 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 03 14:42:59 crc systemd[1]: Started Kubernetes Kubelet. Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.182137 4774 server.go:460] "Adding debug handlers to kubelet server" Oct 03 14:42:59 crc kubenswrapper[4774]: E1003 14:42:59.189612 4774 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.32:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186b0246055705e1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-03 14:42:59.178866145 +0000 UTC m=+1.768069597,LastTimestamp:2025-10-03 14:42:59.178866145 +0000 UTC m=+1.768069597,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 03 14:42:59 crc 
kubenswrapper[4774]: I1003 14:42:59.192333 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.192463 4774 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.192456 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 13:00:10.41084691 +0000 UTC Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.192497 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1870h17m11.218352312s for next certificate rotation Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.192525 4774 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.192532 4774 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 03 14:42:59 crc kubenswrapper[4774]: E1003 14:42:59.192546 4774 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.192628 4774 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 03 14:42:59 crc kubenswrapper[4774]: W1003 14:42:59.197792 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 03 14:42:59 crc kubenswrapper[4774]: E1003 14:42:59.197902 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Oct 03 
14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.200238 4774 factory.go:55] Registering systemd factory Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.200272 4774 factory.go:221] Registration of the systemd container factory successfully Oct 03 14:42:59 crc kubenswrapper[4774]: E1003 14:42:59.200319 4774 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="200ms" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.200851 4774 factory.go:153] Registering CRI-O factory Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.200884 4774 factory.go:221] Registration of the crio container factory successfully Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.200992 4774 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.201037 4774 factory.go:103] Registering Raw factory Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.201059 4774 manager.go:1196] Started watching for new ooms in manager Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.201762 4774 manager.go:319] Starting recovery of all containers Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.210871 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.210928 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.210953 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.210968 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.210981 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.210994 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211005 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211018 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211031 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211043 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211056 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211076 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211087 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211105 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" 
seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211118 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211129 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211142 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211155 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211167 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211185 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 
14:42:59.211198 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211210 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211224 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211240 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211278 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211293 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211315 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211331 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211345 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211358 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211371 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211403 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211419 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211433 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211445 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211457 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211494 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211508 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211524 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211537 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211552 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211564 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211579 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211592 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211614 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" 
volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211628 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211643 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211656 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211670 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211683 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211695 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211710 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211767 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211786 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211821 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211835 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211848 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211861 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211874 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211886 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.211897 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.213556 4774 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.213821 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.213838 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.213852 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.213864 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.213875 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.213888 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.213899 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.213912 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.213926 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.213938 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.213950 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.213962 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.213976 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.213990 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214003 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214018 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214031 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214043 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214072 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214088 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214102 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214115 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214128 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214141 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214156 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214168 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214183 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214196 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214209 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214222 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214234 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214247 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214264 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214277 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214300 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214314 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214328 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214341 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214356 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214375 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214402 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214414 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214433 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214467 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214484 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214505 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214523 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214540 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214554 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214569 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214612 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214630 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214645 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214658 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214671 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214685 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214698 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214713 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214727 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214743 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214756 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214769 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214784 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214797 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214811 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214826 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214840 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214853 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214915 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214930 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214942 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214955 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.214968 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.215016 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.215032 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.215046 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.215060 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.215073 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.215087 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.215099 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.215113 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.215126 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.215519 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.215568 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.215583 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.215595 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.215608 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.215619 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.215662 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.215674 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.215686 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.215697 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.215735 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.215751 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.215764 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.215775 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.215814 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.215832 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.215846 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.215858 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.215869 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.215909 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.215922 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.215936 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.215949 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216000 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216016 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216028 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216040 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216076 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216091 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216103 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216114 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216125 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216159 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216172 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216184 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216195 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216208 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216254 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216268 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216280 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216291 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216323 4774 reconstruct.go:130] "Volume is marked as uncertain and added
into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216337 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216349 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216422 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216439 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216451 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216466 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216504 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216517 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216529 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216541 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216551 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216588 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216602 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216613 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216624 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216636 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216673 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216686 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216698 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216710 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216743 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216758 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216772 4774 reconstruct.go:97] "Volume reconstruction finished" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.216781 4774 reconciler.go:26] "Reconciler: start to sync state" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.218982 4774 manager.go:324] Recovery completed Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.227183 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.228873 4774 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.228930 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.228951 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.230030 4774 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.230055 4774 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.230077 4774 state_mem.go:36] "Initialized new in-memory state store" Oct 03 14:42:59 crc kubenswrapper[4774]: E1003 14:42:59.293198 4774 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.296327 4774 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.298054 4774 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.298112 4774 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.298139 4774 kubelet.go:2335] "Starting kubelet main sync loop" Oct 03 14:42:59 crc kubenswrapper[4774]: E1003 14:42:59.298192 4774 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 03 14:42:59 crc kubenswrapper[4774]: W1003 14:42:59.314839 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 03 14:42:59 crc kubenswrapper[4774]: E1003 14:42:59.314909 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Oct 03 14:42:59 crc kubenswrapper[4774]: E1003 14:42:59.393339 4774 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 03 14:42:59 crc kubenswrapper[4774]: E1003 14:42:59.398623 4774 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Oct 03 14:42:59 crc kubenswrapper[4774]: E1003 14:42:59.401645 4774 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="400ms" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.459837 4774 policy_none.go:49] 
"None policy: Start" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.461978 4774 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.462260 4774 state_mem.go:35] "Initializing new in-memory state store" Oct 03 14:42:59 crc kubenswrapper[4774]: E1003 14:42:59.493553 4774 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 03 14:42:59 crc kubenswrapper[4774]: E1003 14:42:59.594108 4774 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 03 14:42:59 crc kubenswrapper[4774]: E1003 14:42:59.599271 4774 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.606703 4774 manager.go:334] "Starting Device Plugin manager" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.606794 4774 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.606817 4774 server.go:79] "Starting device plugin registration server" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.607609 4774 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.607641 4774 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.608067 4774 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.608196 4774 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.608218 4774 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 03 14:42:59 crc 
kubenswrapper[4774]: E1003 14:42:59.617691 4774 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.708658 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.710443 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.710489 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.710502 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.710539 4774 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 14:42:59 crc kubenswrapper[4774]: E1003 14:42:59.711143 4774 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.32:6443: connect: connection refused" node="crc" Oct 03 14:42:59 crc kubenswrapper[4774]: E1003 14:42:59.803344 4774 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="800ms" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.911934 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.913858 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.913897 4774 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.913909 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:42:59 crc kubenswrapper[4774]: I1003 14:42:59.913936 4774 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 14:42:59 crc kubenswrapper[4774]: E1003 14:42:59.914544 4774 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.32:6443: connect: connection refused" node="crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.000295 4774 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.000447 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.001618 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.001678 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.001689 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.001897 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.002289 4774 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.002365 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.002997 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.003034 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.003048 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.003149 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.003320 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.003358 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.003709 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.003736 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.003748 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.004271 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.004329 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.004348 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.004284 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.004427 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.004442 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.004593 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:00 crc 
kubenswrapper[4774]: I1003 14:43:00.004610 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.004646 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.005577 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.005614 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.005647 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.005662 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.005616 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.005723 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.005814 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.006011 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.006060 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.006738 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.006762 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.006775 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.006807 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.006823 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.006834 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.006921 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.006946 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.007839 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.007877 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.007891 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.128232 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.128287 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.128455 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.128511 4774 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.128587 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.128615 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.128646 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.128669 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.128693 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" 
(UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.128718 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.128740 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.128774 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.128871 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.128930 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.128969 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.180177 4774 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 03 14:43:00 crc kubenswrapper[4774]: W1003 14:43:00.187030 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 03 14:43:00 crc kubenswrapper[4774]: E1003 14:43:00.187126 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.230484 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.230551 4774 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.230599 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.230650 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.230698 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.230723 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.230741 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: 
\"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.230782 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.230805 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.230824 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.230846 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.230889 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.230878 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.230909 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.230956 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.230961 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.230932 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.230823 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.231008 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.231026 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.231030 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.231077 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.231122 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.231135 4774 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.231165 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.231178 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.231207 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.231285 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.231391 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.231318 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.315203 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.316630 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.316694 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.316755 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.316796 4774 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 14:43:00 crc kubenswrapper[4774]: E1003 14:43:00.317513 4774 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.32:6443: connect: connection refused" node="crc" Oct 03 14:43:00 crc kubenswrapper[4774]: W1003 14:43:00.339947 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 03 14:43:00 crc kubenswrapper[4774]: E1003 
14:43:00.340051 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.343593 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.363228 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.374169 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.380877 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: I1003 14:43:00.396745 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 14:43:00 crc kubenswrapper[4774]: W1003 14:43:00.410899 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 03 14:43:00 crc kubenswrapper[4774]: E1003 14:43:00.410978 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Oct 03 14:43:00 crc kubenswrapper[4774]: W1003 14:43:00.471055 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-fd5b20f69f6dcb6e8a3c8f391f979543ea91169dee19a74543f67672b1ea9816 WatchSource:0}: Error finding container fd5b20f69f6dcb6e8a3c8f391f979543ea91169dee19a74543f67672b1ea9816: Status 404 returned error can't find the container with id fd5b20f69f6dcb6e8a3c8f391f979543ea91169dee19a74543f67672b1ea9816 Oct 03 14:43:00 crc kubenswrapper[4774]: W1003 14:43:00.473314 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-1fad84f3f4e241b9bebf859cfa11afa797804a0da8f92638c92e95d1b6e52b4e WatchSource:0}: Error finding container 1fad84f3f4e241b9bebf859cfa11afa797804a0da8f92638c92e95d1b6e52b4e: Status 404 returned error can't find the container with id 1fad84f3f4e241b9bebf859cfa11afa797804a0da8f92638c92e95d1b6e52b4e Oct 03 14:43:00 crc kubenswrapper[4774]: W1003 14:43:00.476742 4774 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-4a12e0b0f06e55459b7776a051a7489016ae1ae042efe29f5273f91190277040 WatchSource:0}: Error finding container 4a12e0b0f06e55459b7776a051a7489016ae1ae042efe29f5273f91190277040: Status 404 returned error can't find the container with id 4a12e0b0f06e55459b7776a051a7489016ae1ae042efe29f5273f91190277040 Oct 03 14:43:00 crc kubenswrapper[4774]: W1003 14:43:00.482821 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 03 14:43:00 crc kubenswrapper[4774]: E1003 14:43:00.482971 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Oct 03 14:43:00 crc kubenswrapper[4774]: E1003 14:43:00.610717 4774 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="1.6s" Oct 03 14:43:01 crc kubenswrapper[4774]: I1003 14:43:01.117665 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:01 crc kubenswrapper[4774]: I1003 14:43:01.118956 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:01 crc kubenswrapper[4774]: I1003 14:43:01.119030 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Oct 03 14:43:01 crc kubenswrapper[4774]: I1003 14:43:01.119056 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:01 crc kubenswrapper[4774]: I1003 14:43:01.119102 4774 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 14:43:01 crc kubenswrapper[4774]: E1003 14:43:01.119846 4774 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.32:6443: connect: connection refused" node="crc" Oct 03 14:43:01 crc kubenswrapper[4774]: I1003 14:43:01.180395 4774 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 03 14:43:01 crc kubenswrapper[4774]: I1003 14:43:01.305895 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8bbbd015c6de9531435805acaa406914b0008c9ee39fddb282e60af616f37070"} Oct 03 14:43:01 crc kubenswrapper[4774]: I1003 14:43:01.307126 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"5016fab626d98552e0a27da03d60642393c7738df795cb81838ebf62fc961d8e"} Oct 03 14:43:01 crc kubenswrapper[4774]: I1003 14:43:01.308161 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4a12e0b0f06e55459b7776a051a7489016ae1ae042efe29f5273f91190277040"} Oct 03 14:43:01 crc kubenswrapper[4774]: I1003 14:43:01.309072 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"fd5b20f69f6dcb6e8a3c8f391f979543ea91169dee19a74543f67672b1ea9816"} Oct 03 14:43:01 crc kubenswrapper[4774]: I1003 14:43:01.309880 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1fad84f3f4e241b9bebf859cfa11afa797804a0da8f92638c92e95d1b6e52b4e"} Oct 03 14:43:02 crc kubenswrapper[4774]: W1003 14:43:02.117537 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 03 14:43:02 crc kubenswrapper[4774]: E1003 14:43:02.118303 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Oct 03 14:43:02 crc kubenswrapper[4774]: I1003 14:43:02.179472 4774 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 03 14:43:02 crc kubenswrapper[4774]: E1003 14:43:02.211587 4774 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="3.2s" Oct 03 14:43:02 crc kubenswrapper[4774]: I1003 14:43:02.317861 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733"} Oct 03 14:43:02 crc kubenswrapper[4774]: I1003 14:43:02.320560 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"272d3589475ea3f31e989b1c7326b814559a8c6cb5f9e5e042ffba28cb1b7c0d"} Oct 03 14:43:02 crc kubenswrapper[4774]: I1003 14:43:02.322815 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"56108f8d576068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb"} Oct 03 14:43:02 crc kubenswrapper[4774]: I1003 14:43:02.324589 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c"} Oct 03 14:43:02 crc kubenswrapper[4774]: I1003 14:43:02.326210 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"79d847934b0a91b6911f58bcf9853e8fe8caeb4c15c07a331a812aa662da7bd4"} Oct 03 14:43:02 crc kubenswrapper[4774]: I1003 14:43:02.720594 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:02 crc kubenswrapper[4774]: I1003 14:43:02.722213 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:02 crc kubenswrapper[4774]: I1003 14:43:02.722281 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:02 crc 
kubenswrapper[4774]: I1003 14:43:02.722302 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:02 crc kubenswrapper[4774]: I1003 14:43:02.722340 4774 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 14:43:02 crc kubenswrapper[4774]: E1003 14:43:02.723008 4774 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.32:6443: connect: connection refused" node="crc" Oct 03 14:43:03 crc kubenswrapper[4774]: I1003 14:43:03.179310 4774 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 03 14:43:03 crc kubenswrapper[4774]: W1003 14:43:03.266515 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 03 14:43:03 crc kubenswrapper[4774]: E1003 14:43:03.267196 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Oct 03 14:43:03 crc kubenswrapper[4774]: I1003 14:43:03.330986 4774 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733" exitCode=0 Oct 03 14:43:03 crc kubenswrapper[4774]: I1003 14:43:03.331162 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733"} Oct 03 14:43:03 crc kubenswrapper[4774]: I1003 14:43:03.331333 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:03 crc kubenswrapper[4774]: I1003 14:43:03.332651 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:03 crc kubenswrapper[4774]: I1003 14:43:03.332702 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:03 crc kubenswrapper[4774]: I1003 14:43:03.332715 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:03 crc kubenswrapper[4774]: I1003 14:43:03.333600 4774 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="272d3589475ea3f31e989b1c7326b814559a8c6cb5f9e5e042ffba28cb1b7c0d" exitCode=0 Oct 03 14:43:03 crc kubenswrapper[4774]: I1003 14:43:03.333726 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"272d3589475ea3f31e989b1c7326b814559a8c6cb5f9e5e042ffba28cb1b7c0d"} Oct 03 14:43:03 crc kubenswrapper[4774]: I1003 14:43:03.333745 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:03 crc kubenswrapper[4774]: I1003 14:43:03.335182 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:03 crc kubenswrapper[4774]: I1003 14:43:03.335222 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:03 crc kubenswrapper[4774]: I1003 14:43:03.335234 4774 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:03 crc kubenswrapper[4774]: I1003 14:43:03.336489 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e"} Oct 03 14:43:03 crc kubenswrapper[4774]: I1003 14:43:03.339335 4774 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c" exitCode=0 Oct 03 14:43:03 crc kubenswrapper[4774]: I1003 14:43:03.339396 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c"} Oct 03 14:43:03 crc kubenswrapper[4774]: I1003 14:43:03.339450 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:03 crc kubenswrapper[4774]: I1003 14:43:03.341220 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:03 crc kubenswrapper[4774]: I1003 14:43:03.341330 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:03 crc kubenswrapper[4774]: I1003 14:43:03.341432 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:03 crc kubenswrapper[4774]: I1003 14:43:03.341716 4774 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="79d847934b0a91b6911f58bcf9853e8fe8caeb4c15c07a331a812aa662da7bd4" exitCode=0 Oct 03 14:43:03 crc kubenswrapper[4774]: I1003 14:43:03.341768 
4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"79d847934b0a91b6911f58bcf9853e8fe8caeb4c15c07a331a812aa662da7bd4"} Oct 03 14:43:03 crc kubenswrapper[4774]: I1003 14:43:03.341847 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:03 crc kubenswrapper[4774]: I1003 14:43:03.343500 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:03 crc kubenswrapper[4774]: I1003 14:43:03.343546 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:03 crc kubenswrapper[4774]: I1003 14:43:03.343644 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:03 crc kubenswrapper[4774]: I1003 14:43:03.347033 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:03 crc kubenswrapper[4774]: I1003 14:43:03.348193 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:03 crc kubenswrapper[4774]: I1003 14:43:03.348237 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:03 crc kubenswrapper[4774]: I1003 14:43:03.348252 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:03 crc kubenswrapper[4774]: W1003 14:43:03.398562 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 03 14:43:03 crc 
kubenswrapper[4774]: E1003 14:43:03.398678 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Oct 03 14:43:03 crc kubenswrapper[4774]: E1003 14:43:03.421709 4774 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.32:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186b0246055705e1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-03 14:42:59.178866145 +0000 UTC m=+1.768069597,LastTimestamp:2025-10-03 14:42:59.178866145 +0000 UTC m=+1.768069597,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 03 14:43:03 crc kubenswrapper[4774]: W1003 14:43:03.443907 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 03 14:43:03 crc kubenswrapper[4774]: E1003 14:43:03.443987 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Oct 03 14:43:04 crc 
kubenswrapper[4774]: I1003 14:43:04.180759 4774 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 03 14:43:04 crc kubenswrapper[4774]: I1003 14:43:04.345732 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"4c584f9a25d7b0b01f007c91f9a0ccca0af44f8a71f933e56cbdcaf38e316481"} Oct 03 14:43:04 crc kubenswrapper[4774]: I1003 14:43:04.345765 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:04 crc kubenswrapper[4774]: I1003 14:43:04.346860 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:04 crc kubenswrapper[4774]: I1003 14:43:04.346897 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:04 crc kubenswrapper[4774]: I1003 14:43:04.346913 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:04 crc kubenswrapper[4774]: I1003 14:43:04.348027 4774 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc" exitCode=0 Oct 03 14:43:04 crc kubenswrapper[4774]: I1003 14:43:04.348107 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc"} Oct 03 14:43:04 crc kubenswrapper[4774]: I1003 14:43:04.348220 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Oct 03 14:43:04 crc kubenswrapper[4774]: I1003 14:43:04.349302 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:04 crc kubenswrapper[4774]: I1003 14:43:04.349346 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:04 crc kubenswrapper[4774]: I1003 14:43:04.349364 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:04 crc kubenswrapper[4774]: I1003 14:43:04.350587 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d9a653e60009bbcb51465ae267a42828fbf1e9863cc50c73ab2f3b490b5f1524"} Oct 03 14:43:04 crc kubenswrapper[4774]: I1003 14:43:04.353801 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd"} Oct 03 14:43:04 crc kubenswrapper[4774]: I1003 14:43:04.356912 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6"} Oct 03 14:43:05 crc kubenswrapper[4774]: I1003 14:43:05.180249 4774 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 03 14:43:05 crc kubenswrapper[4774]: I1003 14:43:05.361716 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133"} Oct 03 14:43:05 crc kubenswrapper[4774]: I1003 14:43:05.361803 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e"} Oct 03 14:43:05 crc kubenswrapper[4774]: I1003 14:43:05.363405 4774 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319" exitCode=0 Oct 03 14:43:05 crc kubenswrapper[4774]: I1003 14:43:05.363467 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319"} Oct 03 14:43:05 crc kubenswrapper[4774]: I1003 14:43:05.363490 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:05 crc kubenswrapper[4774]: I1003 14:43:05.364437 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:05 crc kubenswrapper[4774]: I1003 14:43:05.364474 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:05 crc kubenswrapper[4774]: I1003 14:43:05.364485 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:05 crc kubenswrapper[4774]: I1003 14:43:05.366003 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5e63db8eac87987c01cd81f4cc676a2bb7268ee091097ce8b1ed6a14bc3f2346"} Oct 03 14:43:05 crc kubenswrapper[4774]: I1003 14:43:05.366246 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4623e559e31f88609bb4a406374a3ce997e19ca75d71bc291cb98ed209f1e2db"} Oct 03 14:43:05 crc kubenswrapper[4774]: I1003 14:43:05.366349 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:05 crc kubenswrapper[4774]: I1003 14:43:05.368628 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:05 crc kubenswrapper[4774]: I1003 14:43:05.368658 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:05 crc kubenswrapper[4774]: I1003 14:43:05.368670 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:05 crc kubenswrapper[4774]: I1003 14:43:05.371048 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:05 crc kubenswrapper[4774]: I1003 14:43:05.371452 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:05 crc kubenswrapper[4774]: I1003 14:43:05.371556 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e"} Oct 03 14:43:05 crc kubenswrapper[4774]: I1003 14:43:05.371977 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 
14:43:05 crc kubenswrapper[4774]: I1003 14:43:05.372004 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:05 crc kubenswrapper[4774]: I1003 14:43:05.372037 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:05 crc kubenswrapper[4774]: I1003 14:43:05.372509 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:05 crc kubenswrapper[4774]: I1003 14:43:05.372558 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:05 crc kubenswrapper[4774]: I1003 14:43:05.372577 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:05 crc kubenswrapper[4774]: E1003 14:43:05.413350 4774 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="6.4s" Oct 03 14:43:05 crc kubenswrapper[4774]: I1003 14:43:05.923899 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:05 crc kubenswrapper[4774]: I1003 14:43:05.925565 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:05 crc kubenswrapper[4774]: I1003 14:43:05.925620 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:05 crc kubenswrapper[4774]: I1003 14:43:05.925677 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:05 crc kubenswrapper[4774]: I1003 14:43:05.925708 4774 kubelet_node_status.go:76] "Attempting to register node" node="crc" 
Oct 03 14:43:05 crc kubenswrapper[4774]: E1003 14:43:05.926203 4774 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.32:6443: connect: connection refused" node="crc" Oct 03 14:43:06 crc kubenswrapper[4774]: I1003 14:43:06.180053 4774 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 03 14:43:06 crc kubenswrapper[4774]: I1003 14:43:06.377489 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2975bb7e22ba015f554a696e0f604a9feb29085aac4af67c2727fc040373f54e"} Oct 03 14:43:06 crc kubenswrapper[4774]: I1003 14:43:06.377563 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1"} Oct 03 14:43:06 crc kubenswrapper[4774]: I1003 14:43:06.381604 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa"} Oct 03 14:43:06 crc kubenswrapper[4774]: I1003 14:43:06.381684 4774 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 14:43:06 crc kubenswrapper[4774]: I1003 14:43:06.381725 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:06 crc kubenswrapper[4774]: I1003 14:43:06.381761 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:06 crc 
kubenswrapper[4774]: I1003 14:43:06.383038 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:06 crc kubenswrapper[4774]: I1003 14:43:06.383078 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:06 crc kubenswrapper[4774]: I1003 14:43:06.383094 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:06 crc kubenswrapper[4774]: I1003 14:43:06.383038 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:06 crc kubenswrapper[4774]: I1003 14:43:06.383166 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:06 crc kubenswrapper[4774]: I1003 14:43:06.383182 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:06 crc kubenswrapper[4774]: I1003 14:43:06.487823 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 14:43:06 crc kubenswrapper[4774]: I1003 14:43:06.488333 4774 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": dial tcp 192.168.126.11:10357: connect: connection refused" start-of-body= Oct 03 14:43:06 crc kubenswrapper[4774]: I1003 14:43:06.488416 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": dial tcp 192.168.126.11:10357: connect: connection refused" 
Oct 03 14:43:06 crc kubenswrapper[4774]: W1003 14:43:06.608797 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 03 14:43:06 crc kubenswrapper[4774]: E1003 14:43:06.608879 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Oct 03 14:43:06 crc kubenswrapper[4774]: I1003 14:43:06.819006 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 14:43:07 crc kubenswrapper[4774]: I1003 14:43:07.180040 4774 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 03 14:43:07 crc kubenswrapper[4774]: W1003 14:43:07.375153 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 03 14:43:07 crc kubenswrapper[4774]: E1003 14:43:07.375255 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Oct 03 14:43:07 crc kubenswrapper[4774]: I1003 
14:43:07.387557 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251"} Oct 03 14:43:07 crc kubenswrapper[4774]: I1003 14:43:07.387608 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009"} Oct 03 14:43:07 crc kubenswrapper[4774]: I1003 14:43:07.387612 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:07 crc kubenswrapper[4774]: I1003 14:43:07.387623 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73"} Oct 03 14:43:07 crc kubenswrapper[4774]: I1003 14:43:07.387672 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:07 crc kubenswrapper[4774]: I1003 14:43:07.387721 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:07 crc kubenswrapper[4774]: I1003 14:43:07.388696 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:07 crc kubenswrapper[4774]: I1003 14:43:07.388727 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:07 crc kubenswrapper[4774]: I1003 14:43:07.388737 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:07 crc kubenswrapper[4774]: I1003 14:43:07.388993 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 14:43:07 crc kubenswrapper[4774]: I1003 14:43:07.389019 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:07 crc kubenswrapper[4774]: I1003 14:43:07.389034 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:07 crc kubenswrapper[4774]: I1003 14:43:07.389504 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:07 crc kubenswrapper[4774]: I1003 14:43:07.389548 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:07 crc kubenswrapper[4774]: I1003 14:43:07.389564 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:08 crc kubenswrapper[4774]: I1003 14:43:08.180186 4774 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 03 14:43:08 crc kubenswrapper[4774]: I1003 14:43:08.393686 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687"} Oct 03 14:43:08 crc kubenswrapper[4774]: I1003 14:43:08.393783 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:08 crc kubenswrapper[4774]: I1003 14:43:08.394505 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:08 crc kubenswrapper[4774]: I1003 14:43:08.394537 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 14:43:08 crc kubenswrapper[4774]: I1003 14:43:08.394550 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:09 crc kubenswrapper[4774]: W1003 14:43:09.112689 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 03 14:43:09 crc kubenswrapper[4774]: E1003 14:43:09.112814 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Oct 03 14:43:09 crc kubenswrapper[4774]: I1003 14:43:09.168325 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 14:43:09 crc kubenswrapper[4774]: I1003 14:43:09.168595 4774 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 14:43:09 crc kubenswrapper[4774]: I1003 14:43:09.168655 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:09 crc kubenswrapper[4774]: I1003 14:43:09.169986 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:09 crc kubenswrapper[4774]: I1003 14:43:09.170029 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:09 crc kubenswrapper[4774]: I1003 14:43:09.170039 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:09 crc kubenswrapper[4774]: I1003 14:43:09.180284 4774 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 03 14:43:09 crc kubenswrapper[4774]: W1003 14:43:09.211890 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Oct 03 14:43:09 crc kubenswrapper[4774]: E1003 14:43:09.212038 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Oct 03 14:43:09 crc kubenswrapper[4774]: I1003 14:43:09.396636 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 03 14:43:09 crc kubenswrapper[4774]: I1003 14:43:09.398663 4774 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2975bb7e22ba015f554a696e0f604a9feb29085aac4af67c2727fc040373f54e" exitCode=255 Oct 03 14:43:09 crc kubenswrapper[4774]: I1003 14:43:09.398752 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2975bb7e22ba015f554a696e0f604a9feb29085aac4af67c2727fc040373f54e"} Oct 03 14:43:09 crc kubenswrapper[4774]: I1003 14:43:09.398927 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:09 crc kubenswrapper[4774]: 
I1003 14:43:09.399055 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:09 crc kubenswrapper[4774]: I1003 14:43:09.399813 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:09 crc kubenswrapper[4774]: I1003 14:43:09.399849 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:09 crc kubenswrapper[4774]: I1003 14:43:09.399862 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:09 crc kubenswrapper[4774]: I1003 14:43:09.400113 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:09 crc kubenswrapper[4774]: I1003 14:43:09.400160 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:09 crc kubenswrapper[4774]: I1003 14:43:09.400171 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:09 crc kubenswrapper[4774]: I1003 14:43:09.400344 4774 scope.go:117] "RemoveContainer" containerID="2975bb7e22ba015f554a696e0f604a9feb29085aac4af67c2727fc040373f54e" Oct 03 14:43:09 crc kubenswrapper[4774]: I1003 14:43:09.565740 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 14:43:09 crc kubenswrapper[4774]: E1003 14:43:09.617792 4774 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 03 14:43:09 crc kubenswrapper[4774]: I1003 14:43:09.878582 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 14:43:09 crc kubenswrapper[4774]: I1003 14:43:09.878960 4774 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:09 crc kubenswrapper[4774]: I1003 14:43:09.880479 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:09 crc kubenswrapper[4774]: I1003 14:43:09.880521 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:09 crc kubenswrapper[4774]: I1003 14:43:09.880535 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:09 crc kubenswrapper[4774]: I1003 14:43:09.885646 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 14:43:10 crc kubenswrapper[4774]: I1003 14:43:10.401781 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:10 crc kubenswrapper[4774]: I1003 14:43:10.402638 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:10 crc kubenswrapper[4774]: I1003 14:43:10.402664 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:10 crc kubenswrapper[4774]: I1003 14:43:10.402672 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:10 crc kubenswrapper[4774]: I1003 14:43:10.590435 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 14:43:11 crc kubenswrapper[4774]: I1003 14:43:11.406064 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 03 14:43:11 crc kubenswrapper[4774]: I1003 14:43:11.407773 4774 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa"} Oct 03 14:43:11 crc kubenswrapper[4774]: I1003 14:43:11.407915 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:11 crc kubenswrapper[4774]: I1003 14:43:11.408905 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:11 crc kubenswrapper[4774]: I1003 14:43:11.408948 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:11 crc kubenswrapper[4774]: I1003 14:43:11.408960 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:11 crc kubenswrapper[4774]: I1003 14:43:11.425663 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 14:43:11 crc kubenswrapper[4774]: I1003 14:43:11.425831 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:11 crc kubenswrapper[4774]: I1003 14:43:11.426775 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:11 crc kubenswrapper[4774]: I1003 14:43:11.426809 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:11 crc kubenswrapper[4774]: I1003 14:43:11.426823 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:11 crc kubenswrapper[4774]: I1003 14:43:11.476088 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 14:43:11 crc kubenswrapper[4774]: I1003 14:43:11.527467 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 03 14:43:11 crc kubenswrapper[4774]: I1003 14:43:11.528107 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:11 crc kubenswrapper[4774]: I1003 14:43:11.529423 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:11 crc kubenswrapper[4774]: I1003 14:43:11.529492 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:11 crc kubenswrapper[4774]: I1003 14:43:11.529517 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:12 crc kubenswrapper[4774]: I1003 14:43:12.326823 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:12 crc kubenswrapper[4774]: I1003 14:43:12.328174 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:12 crc kubenswrapper[4774]: I1003 14:43:12.328225 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:12 crc kubenswrapper[4774]: I1003 14:43:12.328243 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:12 crc kubenswrapper[4774]: I1003 14:43:12.328275 4774 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 14:43:12 crc kubenswrapper[4774]: I1003 14:43:12.410288 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:12 crc kubenswrapper[4774]: I1003 14:43:12.410364 4774 kubelet_node_status.go:401] "Setting 
node annotation to enable volume controller attach/detach" Oct 03 14:43:12 crc kubenswrapper[4774]: I1003 14:43:12.410566 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 14:43:12 crc kubenswrapper[4774]: I1003 14:43:12.411394 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:12 crc kubenswrapper[4774]: I1003 14:43:12.411433 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:12 crc kubenswrapper[4774]: I1003 14:43:12.411446 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:12 crc kubenswrapper[4774]: I1003 14:43:12.411902 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:12 crc kubenswrapper[4774]: I1003 14:43:12.411940 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:12 crc kubenswrapper[4774]: I1003 14:43:12.411955 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:12 crc kubenswrapper[4774]: I1003 14:43:12.413954 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 14:43:13 crc kubenswrapper[4774]: I1003 14:43:13.411972 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:13 crc kubenswrapper[4774]: I1003 14:43:13.411972 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:13 crc kubenswrapper[4774]: I1003 14:43:13.412868 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:13 
crc kubenswrapper[4774]: I1003 14:43:13.412895 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:13 crc kubenswrapper[4774]: I1003 14:43:13.412906 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:13 crc kubenswrapper[4774]: I1003 14:43:13.413014 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:13 crc kubenswrapper[4774]: I1003 14:43:13.413053 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:13 crc kubenswrapper[4774]: I1003 14:43:13.413065 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:14 crc kubenswrapper[4774]: I1003 14:43:14.607154 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 03 14:43:14 crc kubenswrapper[4774]: I1003 14:43:14.607513 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:14 crc kubenswrapper[4774]: I1003 14:43:14.609221 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:14 crc kubenswrapper[4774]: I1003 14:43:14.609256 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:14 crc kubenswrapper[4774]: I1003 14:43:14.609267 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:16 crc kubenswrapper[4774]: I1003 14:43:16.906925 4774 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" 
start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 03 14:43:16 crc kubenswrapper[4774]: I1003 14:43:16.906998 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 03 14:43:16 crc kubenswrapper[4774]: I1003 14:43:16.911752 4774 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 03 14:43:16 crc kubenswrapper[4774]: I1003 14:43:16.912020 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 03 14:43:19 crc kubenswrapper[4774]: I1003 14:43:19.442087 4774 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 03 14:43:19 crc kubenswrapper[4774]: I1003 14:43:19.488794 4774 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 03 14:43:19 crc kubenswrapper[4774]: I1003 14:43:19.488876 4774 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 03 14:43:19 crc kubenswrapper[4774]: E1003 14:43:19.617869 4774 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 03 14:43:20 crc kubenswrapper[4774]: I1003 14:43:20.594814 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 14:43:20 crc kubenswrapper[4774]: I1003 14:43:20.594980 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:20 crc kubenswrapper[4774]: I1003 14:43:20.595323 4774 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 03 14:43:20 crc kubenswrapper[4774]: I1003 14:43:20.595383 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 03 14:43:20 crc kubenswrapper[4774]: I1003 14:43:20.595917 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:20 crc kubenswrapper[4774]: I1003 14:43:20.595952 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:20 
crc kubenswrapper[4774]: I1003 14:43:20.595963 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:20 crc kubenswrapper[4774]: I1003 14:43:20.600686 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.432593 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.433055 4774 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.433122 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.433684 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.433732 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.433747 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:21 crc kubenswrapper[4774]: E1003 14:43:21.901459 4774 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="7s" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.902185 4774 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.904241 4774 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.904426 4774 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.916909 4774 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.925463 4774 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.925730 4774 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.926972 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.927230 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.927321 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.927431 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.927527 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:21Z","lastTransitionTime":"2025-10-03T14:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:21 crc kubenswrapper[4774]: E1003 14:43:21.939822 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.943628 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.943841 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.943938 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.944026 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.944102 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:21Z","lastTransitionTime":"2025-10-03T14:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:21 crc kubenswrapper[4774]: E1003 14:43:21.954390 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.957279 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.957429 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.957525 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.957606 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.957667 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:21Z","lastTransitionTime":"2025-10-03T14:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:21 crc kubenswrapper[4774]: E1003 14:43:21.966987 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.969899 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.970069 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.970137 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.970207 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.970271 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:21Z","lastTransitionTime":"2025-10-03T14:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:21 crc kubenswrapper[4774]: E1003 14:43:21.979440 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.981995 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.982027 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.982040 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.982058 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.982071 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:21Z","lastTransitionTime":"2025-10-03T14:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:21 crc kubenswrapper[4774]: E1003 14:43:21.991074 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:43:21 crc kubenswrapper[4774]: E1003 14:43:21.991550 4774 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.995427 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.995621 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.995633 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.995646 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:21 crc kubenswrapper[4774]: I1003 14:43:21.995655 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:21Z","lastTransitionTime":"2025-10-03T14:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.097985 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.098019 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.098028 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.098044 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.098055 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:22Z","lastTransitionTime":"2025-10-03T14:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.148655 4774 apiserver.go:52] "Watching apiserver" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.150663 4774 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.151002 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"] Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.151482 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.151726 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.152010 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:43:22 crc kubenswrapper[4774]: E1003 14:43:22.152060 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:43:22 crc kubenswrapper[4774]: E1003 14:43:22.152109 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.152184 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.152209 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.152365 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:43:22 crc kubenswrapper[4774]: E1003 14:43:22.152451 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.153236 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.154354 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.154547 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.154740 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.154986 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.155961 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.156056 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.156082 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.156124 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.187016 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.193573 4774 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.200089 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.200733 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.200770 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.200782 4774 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.200798 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.200809 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:22Z","lastTransitionTime":"2025-10-03T14:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.209148 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.209214 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.209234 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.209250 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: 
\"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.209285 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.209304 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.209326 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.209357 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.209403 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 03 14:43:22 crc 
kubenswrapper[4774]: I1003 14:43:22.209420 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.209437 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.209508 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.209551 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.209589 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.209626 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.209663 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.209680 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.209696 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.209711 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.209747 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 03 14:43:22 crc 
kubenswrapper[4774]: I1003 14:43:22.209763 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.209778 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.209796 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.209831 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.209866 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.209906 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.209899 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.209922 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.209939 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.209954 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.209991 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") 
pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210008 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210023 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210040 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210074 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210091 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210107 4774 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210121 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210153 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210168 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210169 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210178 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210184 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210234 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210306 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210310 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). 
InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210332 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210255 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210465 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210479 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210485 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210557 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210549 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210580 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210607 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210628 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210594 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210645 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210660 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210675 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210691 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210707 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210722 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210738 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210753 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210768 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210788 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210803 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: 
\"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210817 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210833 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210847 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210861 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210876 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210892 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210906 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210921 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210936 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210949 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210963 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 14:43:22 crc kubenswrapper[4774]: 
I1003 14:43:22.210976 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210992 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211009 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211025 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211040 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211056 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211070 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211087 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211103 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211123 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211147 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 
03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211168 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211188 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211209 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211257 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211289 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211355 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211391 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211406 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211426 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211443 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211459 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 
14:43:22.211475 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211491 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211508 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211524 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211541 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211557 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: 
\"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211574 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211588 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211607 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211626 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211643 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211660 4774 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211676 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211692 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211708 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211724 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211738 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211753 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211796 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211811 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211828 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211868 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211886 4774 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211902 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211918 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212019 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212035 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212051 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212067 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212082 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212097 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212114 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212129 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212144 4774 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212159 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212175 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212191 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212208 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212229 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: 
\"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212250 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212269 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212284 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212299 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212315 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212330 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212346 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212360 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212389 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212456 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212480 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212496 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212516 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212533 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212553 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212569 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212585 4774 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212603 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212621 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212640 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212657 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212672 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212688 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212757 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212776 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212792 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212808 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212825 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212842 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212862 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212879 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212895 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212910 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212925 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212943 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212960 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212976 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212994 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.213010 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.213040 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.213058 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.213088 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.213115 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.213483 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 03 14:43:22 crc 
kubenswrapper[4774]: I1003 14:43:22.213509 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.213536 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.213555 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.213572 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.213587 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.213608 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.213627 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.213648 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.213668 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.213690 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.213711 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 03 14:43:22 crc 
kubenswrapper[4774]: I1003 14:43:22.213734 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.213757 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.213778 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.213799 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.213821 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.213843 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod 
\"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.213862 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.213879 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.213897 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.213937 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.213959 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.213981 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.214001 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.214023 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.214044 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.214068 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.214755 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.214806 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.214839 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.214874 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.214902 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.214940 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.214971 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.215154 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.215172 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.215189 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc 
kubenswrapper[4774]: I1003 14:43:22.215205 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.215441 4774 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.215461 4774 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.215475 4774 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.215489 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.215509 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.216252 4774 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.216572 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.218010 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.210973 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.219694 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.219758 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211068 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211063 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211317 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211329 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211441 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.222452 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211579 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211815 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211937 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212511 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212725 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212923 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.212922 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.213031 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.213131 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.213456 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.214725 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.214741 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.214044 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.214802 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.214997 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.215083 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.215349 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.215542 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.215769 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.215775 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.215571 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.215922 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.216086 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.216091 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.216258 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.216522 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.216541 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.216860 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.216945 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.217083 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.217237 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.215211 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: E1003 14:43:22.217819 4774 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.217884 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.220339 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.220629 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.221194 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.221239 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.221247 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.221495 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.221523 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.221560 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: E1003 14:43:22.221691 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:43:22.721672719 +0000 UTC m=+25.310876171 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.222699 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.222710 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.222725 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.222980 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.222997 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.222937 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.223012 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.223516 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.223987 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.223888 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.224240 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.224275 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.221756 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.221797 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.221875 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.221882 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.222067 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.222161 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.222120 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.222213 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.211012 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.222218 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.222268 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.222279 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.224647 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.225503 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.225507 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.226042 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.226316 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.227998 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.228162 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.228166 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.228442 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.228470 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.228336 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.228482 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.228642 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.228825 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.228833 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.229049 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.229267 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.229311 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.229461 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.229474 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: E1003 14:43:22.229547 4774 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.229595 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.229641 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.229644 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.229086 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.229738 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.229750 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.229994 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.229848 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.230050 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.230094 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.230442 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.230496 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.230522 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.230666 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.230709 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.230814 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.230860 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.231109 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.231287 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.231302 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.231500 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.231597 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.231633 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.231741 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.231040 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.232027 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.232255 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.232335 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.228971 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.232446 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.232648 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.232661 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.233067 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.233106 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.233536 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.233738 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.233752 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.233808 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.234070 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.235700 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:43:22 crc kubenswrapper[4774]: E1003 14:43:22.238253 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 14:43:22.738229654 +0000 UTC m=+25.327433106 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 14:43:22 crc kubenswrapper[4774]: E1003 14:43:22.238288 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 14:43:22.738278335 +0000 UTC m=+25.327481787 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.244475 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:43:22 crc kubenswrapper[4774]: E1003 14:43:22.246509 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 14:43:22 crc kubenswrapper[4774]: E1003 14:43:22.246663 4774 projected.go:288] 
Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 14:43:22 crc kubenswrapper[4774]: E1003 14:43:22.246857 4774 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.246882 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: E1003 14:43:22.246940 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 14:43:22.746920146 +0000 UTC m=+25.336123598 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:43:22 crc kubenswrapper[4774]: E1003 14:43:22.246576 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 14:43:22 crc kubenswrapper[4774]: E1003 14:43:22.246966 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 14:43:22 crc kubenswrapper[4774]: E1003 14:43:22.246975 4774 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:43:22 crc kubenswrapper[4774]: E1003 14:43:22.247003 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 14:43:22.746995148 +0000 UTC m=+25.336198600 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.247195 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.224425 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.247459 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.247479 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.247847 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.248772 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.248818 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.249707 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.249755 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.249935 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.250339 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.250603 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.250705 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.250758 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.250844 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.250847 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.250903 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.252002 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.252272 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.252962 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" 
(OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.255800 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.255911 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.256078 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.256235 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.256320 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.256460 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.256846 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.258459 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.258948 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.258970 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.259143 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.260086 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.260225 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.260803 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.260897 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.260910 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.261128 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.261166 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.261302 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.261390 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.261145 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.261675 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.261814 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.261860 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.262092 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.262292 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.262350 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.262431 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.264562 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.264707 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.265141 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.265933 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.266024 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.266589 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.266949 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.267203 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.268505 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.272934 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.284853 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.286676 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.303284 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.303338 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.303348 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.303360 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.303384 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:22Z","lastTransitionTime":"2025-10-03T14:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.316206 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.316387 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.316440 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.316513 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.316753 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.316783 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.316797 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.316808 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.316820 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.316833 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.316845 4774 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.316857 4774 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.316868 4774 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.316880 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.316891 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.316901 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.316912 4774 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.316923 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.316937 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.316949 4774 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.316961 4774 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.316974 4774 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.316986 4774 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.316996 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317008 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317018 4774 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317028 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317038 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317047 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317055 4774 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317063 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317071 4774 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317079 4774 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317087 4774 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317095 4774 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317103 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317112 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317121 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317129 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317137 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317145 4774 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317154 4774 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317162 4774 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317171 4774 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317180 4774 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317188 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317196 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317203 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317211 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317221 4774 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317229 4774 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317238 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317246 4774 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317256 4774 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317263 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317272 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317280 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317288 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317295 4774 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317305 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317313 4774 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317321 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317329 4774 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317337 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317346 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317354 4774 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317362 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317402 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317413 4774 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317423 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317434 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317445 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317454 4774 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317462 4774 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317471 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317480 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317488 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317497 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317504 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317512 4774 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317520 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317538 4774 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317546 4774 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317553 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317562 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" 
DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317570 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317577 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317585 4774 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317594 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317602 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317610 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317617 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317625 4774 reconciler_common.go:293] 
"Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317633 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317641 4774 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317649 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317657 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317665 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317672 4774 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317680 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: 
\"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317688 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317696 4774 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317705 4774 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317712 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317720 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317730 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317740 4774 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317748 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317756 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317764 4774 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317772 4774 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317779 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317787 4774 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317794 4774 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 03 
14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317802 4774 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317811 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317824 4774 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317832 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317840 4774 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317847 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317855 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317863 4774 reconciler_common.go:293] "Volume detached 
for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317871 4774 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317886 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317893 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317901 4774 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317909 4774 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317923 4774 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317931 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317938 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317947 4774 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317956 4774 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317965 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317973 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317981 4774 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317989 4774 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") 
on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.317996 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318004 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318012 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318020 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318029 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318036 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318044 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318051 
4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318059 4774 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318073 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318082 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318090 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318099 4774 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318106 4774 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318114 4774 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318122 4774 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318129 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318137 4774 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318145 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318152 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318160 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318167 4774 reconciler_common.go:293] "Volume detached for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318175 4774 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318184 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318191 4774 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318210 4774 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318218 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318226 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318235 4774 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 
14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318243 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318251 4774 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318259 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318268 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318276 4774 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318284 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318291 4774 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318299 4774 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318307 4774 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318314 4774 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318323 4774 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318330 4774 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318338 4774 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318345 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318353 4774 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318361 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318385 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318397 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318406 4774 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318414 4774 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318422 4774 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318430 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318437 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.318446 4774 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.408562 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.408610 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.408627 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.408648 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.408665 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:22Z","lastTransitionTime":"2025-10-03T14:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.435607 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.435971 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.437201 4774 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa" exitCode=255 Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.437228 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa"} Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.437265 4774 scope.go:117] "RemoveContainer" containerID="2975bb7e22ba015f554a696e0f604a9feb29085aac4af67c2727fc040373f54e" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.460923 4774 scope.go:117] "RemoveContainer" containerID="91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa" Oct 03 14:43:22 crc kubenswrapper[4774]: E1003 14:43:22.461217 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.463805 4774 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.476802 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 14:43:22 crc kubenswrapper[4774]: W1003 14:43:22.477475 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-7117e8563773b705ff9ad87fd9b0eeb05b05b3aebcbcb6b7eb2deefe1b8cbf3b WatchSource:0}: Error finding container 7117e8563773b705ff9ad87fd9b0eeb05b05b3aebcbcb6b7eb2deefe1b8cbf3b: Status 404 returned error can't find the container with id 7117e8563773b705ff9ad87fd9b0eeb05b05b3aebcbcb6b7eb2deefe1b8cbf3b Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.479580 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.490555 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.490620 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.492088 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.498659 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.504826 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.511022 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.511068 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.511078 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.511096 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.511106 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:22Z","lastTransitionTime":"2025-10-03T14:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.514748 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:43:22 crc kubenswrapper[4774]: W1003 14:43:22.518224 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-4300fc330508eef2e7d6bf4d39b07b0faabf46456f8d4fd9914c0f7bffd0d8b0 WatchSource:0}: Error finding container 4300fc330508eef2e7d6bf4d39b07b0faabf46456f8d4fd9914c0f7bffd0d8b0: Status 404 returned error can't find the container with id 4300fc330508eef2e7d6bf4d39b07b0faabf46456f8d4fd9914c0f7bffd0d8b0 Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.525460 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.535685 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.615775 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.615831 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.615845 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 
14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.615863 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.615880 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:22Z","lastTransitionTime":"2025-10-03T14:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.718443 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.718495 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.718507 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.718525 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.718535 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:22Z","lastTransitionTime":"2025-10-03T14:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.721780 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:43:22 crc kubenswrapper[4774]: E1003 14:43:22.721939 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:43:23.721916034 +0000 UTC m=+26.311119486 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.820723 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.820771 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.820785 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.820802 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:22 crc kubenswrapper[4774]: 
I1003 14:43:22.820812 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:22Z","lastTransitionTime":"2025-10-03T14:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.823151 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.823194 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.823222 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.823248 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:43:22 crc kubenswrapper[4774]: E1003 14:43:22.823346 4774 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 14:43:22 crc kubenswrapper[4774]: E1003 14:43:22.823420 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 14:43:23.823405274 +0000 UTC m=+26.412608736 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 14:43:22 crc kubenswrapper[4774]: E1003 14:43:22.823632 4774 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 14:43:22 crc kubenswrapper[4774]: E1003 14:43:22.823715 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 14:43:23.823694581 +0000 UTC m=+26.412898033 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 14:43:22 crc kubenswrapper[4774]: E1003 14:43:22.823631 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 14:43:22 crc kubenswrapper[4774]: E1003 14:43:22.823754 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 14:43:22 crc kubenswrapper[4774]: E1003 14:43:22.823768 4774 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:43:22 crc kubenswrapper[4774]: E1003 14:43:22.823802 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 14:43:23.823793734 +0000 UTC m=+26.412997256 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:43:22 crc kubenswrapper[4774]: E1003 14:43:22.823631 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 14:43:22 crc kubenswrapper[4774]: E1003 14:43:22.823829 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 14:43:22 crc kubenswrapper[4774]: E1003 14:43:22.823839 4774 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:43:22 crc kubenswrapper[4774]: E1003 14:43:22.823910 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 14:43:23.823901956 +0000 UTC m=+26.413105488 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.924069 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.924138 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.924156 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.924176 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:22 crc kubenswrapper[4774]: I1003 14:43:22.924190 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:22Z","lastTransitionTime":"2025-10-03T14:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.026203 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.026241 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.026249 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.026264 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.026273 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:23Z","lastTransitionTime":"2025-10-03T14:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.128788 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.128830 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.128839 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.128855 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.128865 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:23Z","lastTransitionTime":"2025-10-03T14:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.230976 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.231006 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.231015 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.231028 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.231036 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:23Z","lastTransitionTime":"2025-10-03T14:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.304401 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.305457 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.307857 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.309219 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.311160 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.312548 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.313866 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.315008 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.316299 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.317894 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.319113 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.321197 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.323215 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.324793 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.326215 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.327353 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.328779 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.329418 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.330202 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.331007 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.331669 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.333241 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.333504 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.333545 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.333557 4774 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.333574 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.333586 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:23Z","lastTransitionTime":"2025-10-03T14:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.333781 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.334949 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.335357 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.335964 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.337199 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.337720 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.338840 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.339422 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.340468 4774 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.340591 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.342486 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.343715 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.344186 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.346028 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.346950 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.348010 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.348864 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.350127 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.350871 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.351891 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.352863 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.353767 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.354545 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.355365 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.356234 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.357321 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.358091 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.359252 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.359811 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.360494 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.361125 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.361643 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.435503 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.435561 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.435581 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.435600 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.435614 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:23Z","lastTransitionTime":"2025-10-03T14:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.440283 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"f5c9da0588612c165bbbf8205708cbd75e751e0dc904a5c77f07d517101aea39"} Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.441715 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e"} Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.441764 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"7117e8563773b705ff9ad87fd9b0eeb05b05b3aebcbcb6b7eb2deefe1b8cbf3b"} Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.443444 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.447150 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290"} Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.447188 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f"} Oct 03 14:43:23 crc 
kubenswrapper[4774]: I1003 14:43:23.447201 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4300fc330508eef2e7d6bf4d39b07b0faabf46456f8d4fd9914c0f7bffd0d8b0"} Oct 03 14:43:23 crc kubenswrapper[4774]: E1003 14:43:23.451555 4774 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.451785 4774 scope.go:117] "RemoveContainer" containerID="91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa" Oct 03 14:43:23 crc kubenswrapper[4774]: E1003 14:43:23.451973 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.459668 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2975bb7e22ba015f554a696e0f604a9feb29085aac4af67c2727fc040373f54e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:08Z\\\",\\\"message\\\":\\\"W1003 14:43:06.853820 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 14:43:06.854156 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759502586 cert, and key in /tmp/serving-cert-1672917869/serving-signer.crt, 
/tmp/serving-cert-1672917869/serving-signer.key\\\\nI1003 14:43:07.784108 1 observer_polling.go:159] Starting file observer\\\\nW1003 14:43:07.786958 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 14:43:07.787112 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:07.787824 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1672917869/tls.crt::/tmp/serving-cert-1672917869/tls.key\\\\\\\"\\\\nF1003 14:43:08.655863 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.473185 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.486361 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.501179 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.516152 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.527968 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.537727 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.537767 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.537777 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:23 crc 
kubenswrapper[4774]: I1003 14:43:23.537790 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.537799 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:23Z","lastTransitionTime":"2025-10-03T14:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.541645 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.553134 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.564735 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.579916 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.590393 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.605819 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.619482 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.631438 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.639965 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.640002 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.640014 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.640034 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.640047 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:23Z","lastTransitionTime":"2025-10-03T14:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.730721 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:43:23 crc kubenswrapper[4774]: E1003 14:43:23.730904 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:43:25.730880781 +0000 UTC m=+28.320084233 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.742575 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.742658 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.742672 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.742694 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.742705 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:23Z","lastTransitionTime":"2025-10-03T14:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.831466 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.831696 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:43:23 crc kubenswrapper[4774]: E1003 14:43:23.831643 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 14:43:23 crc kubenswrapper[4774]: E1003 14:43:23.831750 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 14:43:23 crc kubenswrapper[4774]: E1003 14:43:23.831765 4774 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:43:23 crc kubenswrapper[4774]: E1003 14:43:23.831819 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 14:43:25.831800466 +0000 UTC m=+28.421003918 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:43:23 crc kubenswrapper[4774]: E1003 14:43:23.831984 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 14:43:23 crc kubenswrapper[4774]: E1003 14:43:23.832031 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 14:43:23 crc kubenswrapper[4774]: E1003 14:43:23.832049 4774 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:43:23 crc kubenswrapper[4774]: E1003 14:43:23.832114 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 14:43:25.832089693 +0000 UTC m=+28.421293145 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.832153 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.832183 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:43:23 crc kubenswrapper[4774]: E1003 14:43:23.832235 4774 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 14:43:23 crc kubenswrapper[4774]: E1003 14:43:23.832260 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 14:43:25.832252957 +0000 UTC m=+28.421456409 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 14:43:23 crc kubenswrapper[4774]: E1003 14:43:23.832301 4774 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 14:43:23 crc kubenswrapper[4774]: E1003 14:43:23.832320 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 14:43:25.832313279 +0000 UTC m=+28.421516731 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.845177 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.845230 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.845244 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.845262 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 
03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.845273 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:23Z","lastTransitionTime":"2025-10-03T14:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.947619 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.947670 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.947684 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.947707 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:23 crc kubenswrapper[4774]: I1003 14:43:23.947719 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:23Z","lastTransitionTime":"2025-10-03T14:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.050307 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.050349 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.050361 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.050393 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.050406 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:24Z","lastTransitionTime":"2025-10-03T14:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.152648 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.152688 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.152697 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.152731 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.152741 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:24Z","lastTransitionTime":"2025-10-03T14:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.255094 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.255143 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.255157 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.255175 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.255187 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:24Z","lastTransitionTime":"2025-10-03T14:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.298321 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.298406 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.298443 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:43:24 crc kubenswrapper[4774]: E1003 14:43:24.298497 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:43:24 crc kubenswrapper[4774]: E1003 14:43:24.298542 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:43:24 crc kubenswrapper[4774]: E1003 14:43:24.298590 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.356737 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.356787 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.356802 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.356821 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.356867 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:24Z","lastTransitionTime":"2025-10-03T14:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.449622 4774 scope.go:117] "RemoveContainer" containerID="91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa" Oct 03 14:43:24 crc kubenswrapper[4774]: E1003 14:43:24.449801 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.458737 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.458770 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.458779 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.458795 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.458805 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:24Z","lastTransitionTime":"2025-10-03T14:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.561696 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.561750 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.561766 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.561787 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.561803 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:24Z","lastTransitionTime":"2025-10-03T14:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.632644 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.645021 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.645698 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:24Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.646543 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.658149 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:24Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.663659 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.663694 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.663704 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.663719 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.663729 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:24Z","lastTransitionTime":"2025-10-03T14:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.669386 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:24Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.680517 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:24Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.694260 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:24Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.719786 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db
4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:24Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.732884 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:24Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.755690 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:24Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.766970 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.767026 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.767038 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.767064 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.767076 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:24Z","lastTransitionTime":"2025-10-03T14:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.771633 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:24Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.784653 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:24Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.798054 4774 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:24Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.810722 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:24Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.822867 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:24Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.839816 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:24Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.853338 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:24Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.868934 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.868974 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.868984 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.868998 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.869007 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:24Z","lastTransitionTime":"2025-10-03T14:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.971895 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.971946 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.971958 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.971977 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:24 crc kubenswrapper[4774]: I1003 14:43:24.971990 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:24Z","lastTransitionTime":"2025-10-03T14:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.075163 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.075547 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.075679 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.076127 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.076228 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:25Z","lastTransitionTime":"2025-10-03T14:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.179172 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.179281 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.179298 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.179321 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.179334 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:25Z","lastTransitionTime":"2025-10-03T14:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.281941 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.282232 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.282320 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.282453 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.282552 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:25Z","lastTransitionTime":"2025-10-03T14:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.385415 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.385452 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.385461 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.385476 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.385487 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:25Z","lastTransitionTime":"2025-10-03T14:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.453195 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2"} Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.488560 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.488622 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.488639 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.488659 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.488670 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:25Z","lastTransitionTime":"2025-10-03T14:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.509836 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-wspzq"] Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.510115 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-wspzq" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.510846 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Comp
leted\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.511918 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.512137 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.512551 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.512553 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.531879 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.547512 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.548667 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54bhg\" (UniqueName: \"kubernetes.io/projected/0b8a9763-d221-4434-8349-cf961e825cf7-kube-api-access-54bhg\") pod \"node-ca-wspzq\" (UID: \"0b8a9763-d221-4434-8349-cf961e825cf7\") " pod="openshift-image-registry/node-ca-wspzq" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.548718 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0b8a9763-d221-4434-8349-cf961e825cf7-host\") pod \"node-ca-wspzq\" (UID: \"0b8a9763-d221-4434-8349-cf961e825cf7\") " pod="openshift-image-registry/node-ca-wspzq" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.548862 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0b8a9763-d221-4434-8349-cf961e825cf7-serviceca\") pod \"node-ca-wspzq\" (UID: \"0b8a9763-d221-4434-8349-cf961e825cf7\") " pod="openshift-image-registry/node-ca-wspzq" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.561635 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPat
h\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.574133 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the 
pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.586562 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.591512 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.591769 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.591834 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:25 crc 
kubenswrapper[4774]: I1003 14:43:25.591914 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.591979 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:25Z","lastTransitionTime":"2025-10-03T14:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.602544 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.617310 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.649720 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54bhg\" (UniqueName: \"kubernetes.io/projected/0b8a9763-d221-4434-8349-cf961e825cf7-kube-api-access-54bhg\") pod \"node-ca-wspzq\" (UID: \"0b8a9763-d221-4434-8349-cf961e825cf7\") " pod="openshift-image-registry/node-ca-wspzq" Oct 03 14:43:25 crc kubenswrapper[4774]: 
I1003 14:43:25.649775 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0b8a9763-d221-4434-8349-cf961e825cf7-host\") pod \"node-ca-wspzq\" (UID: \"0b8a9763-d221-4434-8349-cf961e825cf7\") " pod="openshift-image-registry/node-ca-wspzq" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.649819 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0b8a9763-d221-4434-8349-cf961e825cf7-serviceca\") pod \"node-ca-wspzq\" (UID: \"0b8a9763-d221-4434-8349-cf961e825cf7\") " pod="openshift-image-registry/node-ca-wspzq" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.649909 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0b8a9763-d221-4434-8349-cf961e825cf7-host\") pod \"node-ca-wspzq\" (UID: \"0b8a9763-d221-4434-8349-cf961e825cf7\") " pod="openshift-image-registry/node-ca-wspzq" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.652471 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.653730 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0b8a9763-d221-4434-8349-cf961e825cf7-serviceca\") pod \"node-ca-wspzq\" (UID: \"0b8a9763-d221-4434-8349-cf961e825cf7\") " pod="openshift-image-registry/node-ca-wspzq" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.672072 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54bhg\" (UniqueName: \"kubernetes.io/projected/0b8a9763-d221-4434-8349-cf961e825cf7-kube-api-access-54bhg\") pod \"node-ca-wspzq\" (UID: \"0b8a9763-d221-4434-8349-cf961e825cf7\") " pod="openshift-image-registry/node-ca-wspzq" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.689732 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.694427 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.694468 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.694480 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.694497 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:25 crc kubenswrapper[4774]: 
I1003 14:43:25.694508 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:25Z","lastTransitionTime":"2025-10-03T14:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.708248 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.719520 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.739523 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.750977 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:43:25 crc kubenswrapper[4774]: E1003 14:43:25.755625 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:43:29.755193629 +0000 UTC m=+32.344397071 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.756445 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.767979 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T14:43:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.779849 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.795809 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.796542 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.796576 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.796596 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.796611 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.796622 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:25Z","lastTransitionTime":"2025-10-03T14:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.823023 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-wspzq" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.852298 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.852343 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.852370 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.852413 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:43:25 crc kubenswrapper[4774]: E1003 14:43:25.852528 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 14:43:25 crc kubenswrapper[4774]: E1003 14:43:25.852546 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 14:43:25 crc kubenswrapper[4774]: E1003 14:43:25.852558 4774 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:43:25 crc kubenswrapper[4774]: E1003 14:43:25.852606 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 14:43:29.852589949 +0000 UTC m=+32.441793401 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:43:25 crc kubenswrapper[4774]: E1003 14:43:25.852959 4774 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 14:43:25 crc kubenswrapper[4774]: E1003 14:43:25.852999 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-03 14:43:29.852988989 +0000 UTC m=+32.442192441 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 14:43:25 crc kubenswrapper[4774]: E1003 14:43:25.853056 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 14:43:25 crc kubenswrapper[4774]: E1003 14:43:25.853069 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 14:43:25 crc kubenswrapper[4774]: E1003 14:43:25.853080 4774 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:43:25 crc kubenswrapper[4774]: E1003 14:43:25.853107 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 14:43:29.853097902 +0000 UTC m=+32.442301354 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:43:25 crc kubenswrapper[4774]: E1003 14:43:25.853158 4774 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 14:43:25 crc kubenswrapper[4774]: E1003 14:43:25.853187 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 14:43:29.853177284 +0000 UTC m=+32.442380736 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.898685 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.898720 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.898735 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.898751 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:25 crc kubenswrapper[4774]: I1003 14:43:25.898762 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:25Z","lastTransitionTime":"2025-10-03T14:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.000968 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.001301 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.001312 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.001330 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.001338 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:26Z","lastTransitionTime":"2025-10-03T14:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.017924 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-jk5hb"] Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.018212 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.020613 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.021713 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.022012 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.022667 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.023164 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-s6v5z"] Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.023564 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.025019 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-5bs7x"] Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.025655 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.031792 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.031937 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-rsftk"] Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.031996 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.032147 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.032240 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rsftk" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.034775 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.035415 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.039642 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.040092 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.040342 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.041166 4774 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.041654 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.041873 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.050108 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\
\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.061126 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.084656 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.103499 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.103528 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.103539 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.103555 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.103567 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:26Z","lastTransitionTime":"2025-10-03T14:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.120510 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.155086 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ca37ac4b-f421-4198-a179-12901d36f0f5-mcd-auth-proxy-config\") pod \"machine-config-daemon-s6v5z\" (UID: \"ca37ac4b-f421-4198-a179-12901d36f0f5\") " pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.155131 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-os-release\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.155151 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-multus-conf-dir\") pod \"multus-jk5hb\" (UID: 
\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.155210 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5bs7x\" (UID: \"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\") " pod="openshift-multus/multus-additional-cni-plugins-5bs7x" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.155266 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-host-run-k8s-cni-cncf-io\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.155289 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e-system-cni-dir\") pod \"multus-additional-cni-plugins-5bs7x\" (UID: \"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\") " pod="openshift-multus/multus-additional-cni-plugins-5bs7x" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.155335 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/710a59a2-4b99-43b8-89b7-a7a1c8723d38-hosts-file\") pod \"node-resolver-rsftk\" (UID: \"710a59a2-4b99-43b8-89b7-a7a1c8723d38\") " pod="openshift-dns/node-resolver-rsftk" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.155357 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-etc-kubernetes\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.155426 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpr59\" (UniqueName: \"kubernetes.io/projected/ca37ac4b-f421-4198-a179-12901d36f0f5-kube-api-access-wpr59\") pod \"machine-config-daemon-s6v5z\" (UID: \"ca37ac4b-f421-4198-a179-12901d36f0f5\") " pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.155466 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-host-var-lib-cni-bin\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.155488 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-hostroot\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.155507 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbvhk\" (UniqueName: \"kubernetes.io/projected/710a59a2-4b99-43b8-89b7-a7a1c8723d38-kube-api-access-cbvhk\") pod \"node-resolver-rsftk\" (UID: \"710a59a2-4b99-43b8-89b7-a7a1c8723d38\") " pod="openshift-dns/node-resolver-rsftk" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.155568 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfn4b\" 
(UniqueName: \"kubernetes.io/projected/cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e-kube-api-access-sfn4b\") pod \"multus-additional-cni-plugins-5bs7x\" (UID: \"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\") " pod="openshift-multus/multus-additional-cni-plugins-5bs7x" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.155617 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ca37ac4b-f421-4198-a179-12901d36f0f5-rootfs\") pod \"machine-config-daemon-s6v5z\" (UID: \"ca37ac4b-f421-4198-a179-12901d36f0f5\") " pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.155647 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-host-var-lib-kubelet\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.155694 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e-os-release\") pod \"multus-additional-cni-plugins-5bs7x\" (UID: \"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\") " pod="openshift-multus/multus-additional-cni-plugins-5bs7x" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.155716 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-multus-socket-dir-parent\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.155767 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-system-cni-dir\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.155785 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6dfr\" (UniqueName: \"kubernetes.io/projected/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-kube-api-access-p6dfr\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.155850 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-host-run-multus-certs\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.155898 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-multus-cni-dir\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.155919 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-cnibin\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.155957 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e-cni-binary-copy\") pod \"multus-additional-cni-plugins-5bs7x\" (UID: \"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\") " pod="openshift-multus/multus-additional-cni-plugins-5bs7x" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.155979 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-cni-binary-copy\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.155997 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-host-var-lib-cni-multus\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.156016 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-multus-daemon-config\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.156086 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca37ac4b-f421-4198-a179-12901d36f0f5-proxy-tls\") pod \"machine-config-daemon-s6v5z\" (UID: \"ca37ac4b-f421-4198-a179-12901d36f0f5\") " pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.156122 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e-cnibin\") pod \"multus-additional-cni-plugins-5bs7x\" (UID: \"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\") " pod="openshift-multus/multus-additional-cni-plugins-5bs7x" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.156140 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5bs7x\" (UID: \"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\") " pod="openshift-multus/multus-additional-cni-plugins-5bs7x" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.156156 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-host-run-netns\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.177031 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db
4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.201569 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.206145 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.206196 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.206209 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.206223 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.206233 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:26Z","lastTransitionTime":"2025-10-03T14:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.219433 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.248176 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.256743 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpr59\" (UniqueName: \"kubernetes.io/projected/ca37ac4b-f421-4198-a179-12901d36f0f5-kube-api-access-wpr59\") pod \"machine-config-daemon-s6v5z\" (UID: \"ca37ac4b-f421-4198-a179-12901d36f0f5\") " pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.256791 4774 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-host-var-lib-cni-bin\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.256808 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-etc-kubernetes\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.256824 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-hostroot\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.256840 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfn4b\" (UniqueName: \"kubernetes.io/projected/cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e-kube-api-access-sfn4b\") pod \"multus-additional-cni-plugins-5bs7x\" (UID: \"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\") " pod="openshift-multus/multus-additional-cni-plugins-5bs7x" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.256860 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbvhk\" (UniqueName: \"kubernetes.io/projected/710a59a2-4b99-43b8-89b7-a7a1c8723d38-kube-api-access-cbvhk\") pod \"node-resolver-rsftk\" (UID: \"710a59a2-4b99-43b8-89b7-a7a1c8723d38\") " pod="openshift-dns/node-resolver-rsftk" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.256885 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/ca37ac4b-f421-4198-a179-12901d36f0f5-rootfs\") pod \"machine-config-daemon-s6v5z\" (UID: \"ca37ac4b-f421-4198-a179-12901d36f0f5\") " pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.256903 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e-os-release\") pod \"multus-additional-cni-plugins-5bs7x\" (UID: \"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\") " pod="openshift-multus/multus-additional-cni-plugins-5bs7x" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.256923 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-multus-socket-dir-parent\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.256939 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-host-var-lib-kubelet\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.256954 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-system-cni-dir\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.256972 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6dfr\" (UniqueName: 
\"kubernetes.io/projected/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-kube-api-access-p6dfr\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.256987 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-multus-cni-dir\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.257007 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-cnibin\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.257021 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-host-run-multus-certs\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.257044 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e-cni-binary-copy\") pod \"multus-additional-cni-plugins-5bs7x\" (UID: \"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\") " pod="openshift-multus/multus-additional-cni-plugins-5bs7x" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.257066 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-host-var-lib-cni-multus\") pod \"multus-jk5hb\" (UID: 
\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.257080 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-multus-daemon-config\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.257107 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca37ac4b-f421-4198-a179-12901d36f0f5-proxy-tls\") pod \"machine-config-daemon-s6v5z\" (UID: \"ca37ac4b-f421-4198-a179-12901d36f0f5\") " pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.257126 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-cni-binary-copy\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.257144 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5bs7x\" (UID: \"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\") " pod="openshift-multus/multus-additional-cni-plugins-5bs7x" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.257159 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-host-run-netns\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " 
pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.257188 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e-cnibin\") pod \"multus-additional-cni-plugins-5bs7x\" (UID: \"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\") " pod="openshift-multus/multus-additional-cni-plugins-5bs7x" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.257205 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-os-release\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.257219 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-multus-conf-dir\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.257248 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ca37ac4b-f421-4198-a179-12901d36f0f5-mcd-auth-proxy-config\") pod \"machine-config-daemon-s6v5z\" (UID: \"ca37ac4b-f421-4198-a179-12901d36f0f5\") " pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.257265 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-host-run-k8s-cni-cncf-io\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc 
kubenswrapper[4774]: I1003 14:43:26.257281 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e-system-cni-dir\") pod \"multus-additional-cni-plugins-5bs7x\" (UID: \"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\") " pod="openshift-multus/multus-additional-cni-plugins-5bs7x" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.257297 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5bs7x\" (UID: \"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\") " pod="openshift-multus/multus-additional-cni-plugins-5bs7x" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.257312 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/710a59a2-4b99-43b8-89b7-a7a1c8723d38-hosts-file\") pod \"node-resolver-rsftk\" (UID: \"710a59a2-4b99-43b8-89b7-a7a1c8723d38\") " pod="openshift-dns/node-resolver-rsftk" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.257401 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/710a59a2-4b99-43b8-89b7-a7a1c8723d38-hosts-file\") pod \"node-resolver-rsftk\" (UID: \"710a59a2-4b99-43b8-89b7-a7a1c8723d38\") " pod="openshift-dns/node-resolver-rsftk" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.257620 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-host-var-lib-cni-bin\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.257644 4774 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-etc-kubernetes\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.257663 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-hostroot\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.257882 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ca37ac4b-f421-4198-a179-12901d36f0f5-rootfs\") pod \"machine-config-daemon-s6v5z\" (UID: \"ca37ac4b-f421-4198-a179-12901d36f0f5\") " pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.257929 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e-os-release\") pod \"multus-additional-cni-plugins-5bs7x\" (UID: \"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\") " pod="openshift-multus/multus-additional-cni-plugins-5bs7x" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.258075 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-multus-socket-dir-parent\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.258104 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-host-var-lib-kubelet\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.258135 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-system-cni-dir\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.258361 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-multus-cni-dir\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.258408 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-cnibin\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.259186 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-multus-daemon-config\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.258454 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e-cnibin\") pod \"multus-additional-cni-plugins-5bs7x\" (UID: \"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\") " pod="openshift-multus/multus-additional-cni-plugins-5bs7x" 
Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.259247 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-host-run-multus-certs\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.259737 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e-cni-binary-copy\") pod \"multus-additional-cni-plugins-5bs7x\" (UID: \"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\") " pod="openshift-multus/multus-additional-cni-plugins-5bs7x" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.259797 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-os-release\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.259828 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-host-run-netns\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.260231 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-cni-binary-copy\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.261555 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-host-run-k8s-cni-cncf-io\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.261676 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-multus-conf-dir\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.261902 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e-system-cni-dir\") pod \"multus-additional-cni-plugins-5bs7x\" (UID: \"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\") " pod="openshift-multus/multus-additional-cni-plugins-5bs7x" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.262262 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-host-var-lib-cni-multus\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.262262 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5bs7x\" (UID: \"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\") " pod="openshift-multus/multus-additional-cni-plugins-5bs7x" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.262587 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/ca37ac4b-f421-4198-a179-12901d36f0f5-mcd-auth-proxy-config\") pod \"machine-config-daemon-s6v5z\" (UID: \"ca37ac4b-f421-4198-a179-12901d36f0f5\") " pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.263209 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5bs7x\" (UID: \"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\") " pod="openshift-multus/multus-additional-cni-plugins-5bs7x" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.266090 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca37ac4b-f421-4198-a179-12901d36f0f5-proxy-tls\") pod \"machine-config-daemon-s6v5z\" (UID: \"ca37ac4b-f421-4198-a179-12901d36f0f5\") " pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.270942 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.278710 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfn4b\" (UniqueName: \"kubernetes.io/projected/cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e-kube-api-access-sfn4b\") pod \"multus-additional-cni-plugins-5bs7x\" (UID: \"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\") " pod="openshift-multus/multus-additional-cni-plugins-5bs7x" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.278746 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpr59\" (UniqueName: \"kubernetes.io/projected/ca37ac4b-f421-4198-a179-12901d36f0f5-kube-api-access-wpr59\") pod \"machine-config-daemon-s6v5z\" (UID: \"ca37ac4b-f421-4198-a179-12901d36f0f5\") " pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.279541 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6dfr\" (UniqueName: \"kubernetes.io/projected/4f2cc8dc-61c3-4a0b-8da3-b899094eaa53-kube-api-access-p6dfr\") pod \"multus-jk5hb\" (UID: \"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\") " pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.282897 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbvhk\" (UniqueName: \"kubernetes.io/projected/710a59a2-4b99-43b8-89b7-a7a1c8723d38-kube-api-access-cbvhk\") pod \"node-resolver-rsftk\" (UID: \"710a59a2-4b99-43b8-89b7-a7a1c8723d38\") " pod="openshift-dns/node-resolver-rsftk" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.286444 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.297880 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.298970 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:43:26 crc kubenswrapper[4774]: E1003 14:43:26.299083 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.299126 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.299118 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:43:26 crc kubenswrapper[4774]: E1003 14:43:26.299234 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:43:26 crc kubenswrapper[4774]: E1003 14:43:26.299317 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.308242 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.308280 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.308290 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.308307 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.308317 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:26Z","lastTransitionTime":"2025-10-03T14:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.309313 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.328850 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-jk5hb" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.331271 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: W1003 14:43:26.339344 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f2cc8dc_61c3_4a0b_8da3_b899094eaa53.slice/crio-b6b3a0a8125cfd22a010ae3a5712a911824c5d8fd4ea7d912e64c2038430dcc7 WatchSource:0}: Error finding container b6b3a0a8125cfd22a010ae3a5712a911824c5d8fd4ea7d912e64c2038430dcc7: Status 404 returned error can't find the container with id b6b3a0a8125cfd22a010ae3a5712a911824c5d8fd4ea7d912e64c2038430dcc7 Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.339847 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.344923 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}
]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.348011 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.354040 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rsftk" Oct 03 14:43:26 crc kubenswrapper[4774]: W1003 14:43:26.354338 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca37ac4b_f421_4198_a179_12901d36f0f5.slice/crio-27ed3c4a9268f4ead8202ce130750ee244ac28a3c1f33b548fb4f14584e880c3 WatchSource:0}: Error finding container 27ed3c4a9268f4ead8202ce130750ee244ac28a3c1f33b548fb4f14584e880c3: Status 404 returned error can't find the container with id 27ed3c4a9268f4ead8202ce130750ee244ac28a3c1f33b548fb4f14584e880c3 Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.361197 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: W1003 14:43:26.369892 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd01f0ff_e4fd_4c0a_8ec1_c0e19eabad6e.slice/crio-0079f900a18da20599a06afc4f8f140ea544b6eb960032e504c538f3ee7654cf WatchSource:0}: Error finding container 0079f900a18da20599a06afc4f8f140ea544b6eb960032e504c538f3ee7654cf: Status 404 returned error can't find the container with id 0079f900a18da20599a06afc4f8f140ea544b6eb960032e504c538f3ee7654cf Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.378176 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.392485 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.402678 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.412229 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.412259 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.412270 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.412287 4774 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.412300 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:26Z","lastTransitionTime":"2025-10-03T14:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.424587 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.439254 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.450488 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.457026 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rsftk" event={"ID":"710a59a2-4b99-43b8-89b7-a7a1c8723d38","Type":"ContainerStarted","Data":"f92aaa1e00b135e3869d9be51136535e024aad0a4d12b21ec37321bb0848b512"} Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.459565 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jk5hb" 
event={"ID":"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53","Type":"ContainerStarted","Data":"b6b3a0a8125cfd22a010ae3a5712a911824c5d8fd4ea7d912e64c2038430dcc7"} Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.462249 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" event={"ID":"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e","Type":"ContainerStarted","Data":"0079f900a18da20599a06afc4f8f140ea544b6eb960032e504c538f3ee7654cf"} Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.463462 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" event={"ID":"ca37ac4b-f421-4198-a179-12901d36f0f5","Type":"ContainerStarted","Data":"27ed3c4a9268f4ead8202ce130750ee244ac28a3c1f33b548fb4f14584e880c3"} Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.465307 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wspzq" event={"ID":"0b8a9763-d221-4434-8349-cf961e825cf7","Type":"ContainerStarted","Data":"98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d"} Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.465353 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wspzq" event={"ID":"0b8a9763-d221-4434-8349-cf961e825cf7","Type":"ContainerStarted","Data":"ca13df5a9ace25d0d42fb934153b79168d80c6e2a40e9cd4bf8d30356236ee8e"} Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.467657 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.479674 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.493785 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.493728 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.500053 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.500737 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.506746 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.514470 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.514514 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.514525 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.514540 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.514549 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:26Z","lastTransitionTime":"2025-10-03T14:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.517584 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.536990 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.553211 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.569651 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.584546 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.617161 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.617197 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.617207 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.617222 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.617233 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:26Z","lastTransitionTime":"2025-10-03T14:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.623305 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.635325 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.644897 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.661303 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.677428 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.691824 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.707575 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.719401 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.719436 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.719447 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.719463 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.719475 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:26Z","lastTransitionTime":"2025-10-03T14:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.731269 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.743997 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.756199 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.766620 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.768032 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jzv75"] Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.768780 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.770964 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.771119 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.771166 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.771325 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.771464 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.772534 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.772615 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.778986 4774 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.791292 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.803586 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.814176 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.822058 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.822094 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.822103 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.822117 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.822130 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:26Z","lastTransitionTime":"2025-10-03T14:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.825516 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.844238 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.857749 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.860879 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-systemd-units\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.860922 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-run-ovn-kubernetes\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.860958 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-cni-bin\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.860981 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp65q\" (UniqueName: \"kubernetes.io/projected/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-kube-api-access-gp65q\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.861001 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-run-ovn\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.861038 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-node-log\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.861059 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-cni-netd\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.861093 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-var-lib-openvswitch\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.861136 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-ovnkube-script-lib\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.861172 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-kubelet\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.861193 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-run-openvswitch\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.861214 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-ovnkube-config\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.861260 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-log-socket\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.861287 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.861320 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-run-netns\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.861340 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-etc-openvswitch\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.861406 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-env-overrides\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.861431 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-ovn-node-metrics-cert\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.861455 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-slash\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.861475 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-run-systemd\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.870697 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.888247 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.899393 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.909099 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.920106 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.924798 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.924852 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.924867 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.924883 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.924895 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:26Z","lastTransitionTime":"2025-10-03T14:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.959913 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.962317 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-run-systemd\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.962353 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-slash\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.962405 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-systemd-units\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.962428 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-run-ovn-kubernetes\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.962465 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-cni-bin\") pod 
\"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.962475 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-run-systemd\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.962511 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-systemd-units\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.962543 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-run-ovn-kubernetes\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.962567 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-cni-bin\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.962510 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-slash\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.962486 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gp65q\" (UniqueName: \"kubernetes.io/projected/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-kube-api-access-gp65q\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.962643 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-run-ovn\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.962669 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-node-log\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.962684 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-cni-netd\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.962709 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-ovnkube-script-lib\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc 
kubenswrapper[4774]: I1003 14:43:26.962728 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-var-lib-openvswitch\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.962742 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-kubelet\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.962748 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-node-log\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.962756 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-run-openvswitch\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.962773 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-run-openvswitch\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.962797 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-ovnkube-config\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.962810 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-cni-netd\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.962835 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-log-socket\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.962875 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.962899 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-run-netns\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.962919 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-etc-openvswitch\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.962949 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-env-overrides\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.962969 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-ovn-node-metrics-cert\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.963158 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-var-lib-openvswitch\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.963176 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.963218 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-run-netns\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.963228 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-etc-openvswitch\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.963220 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-kubelet\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.963259 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-run-ovn\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.963287 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-log-socket\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.963499 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-ovnkube-script-lib\") pod \"ovnkube-node-jzv75\" (UID: 
\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.963702 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-env-overrides\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.963833 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-ovnkube-config\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:26 crc kubenswrapper[4774]: I1003 14:43:26.966226 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-ovn-node-metrics-cert\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.007425 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp65q\" (UniqueName: \"kubernetes.io/projected/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-kube-api-access-gp65q\") pod \"ovnkube-node-jzv75\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.020034 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.026765 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.026807 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.026817 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.026832 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.026843 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:27Z","lastTransitionTime":"2025-10-03T14:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.059692 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.081343 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:27 crc kubenswrapper[4774]: W1003 14:43:27.091920 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01bef0c3_23b3_4f49_8c33_3f2ec7503b12.slice/crio-c515b30d898275d86790d1d37cb131444b0cb4a19d897d79dccb46b7ed13ae4b WatchSource:0}: Error finding container c515b30d898275d86790d1d37cb131444b0cb4a19d897d79dccb46b7ed13ae4b: Status 404 returned error can't find the container with id c515b30d898275d86790d1d37cb131444b0cb4a19d897d79dccb46b7ed13ae4b Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.107992 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a564
6fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.128869 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.128905 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.128915 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 
14:43:27.128932 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.128941 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:27Z","lastTransitionTime":"2025-10-03T14:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.145719 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.178689 4774 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.223344 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.231357 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.231437 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.231453 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.231472 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.231814 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:27Z","lastTransitionTime":"2025-10-03T14:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.260386 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.299023 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.333549 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.333580 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.333590 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.333603 4774 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.333612 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:27Z","lastTransitionTime":"2025-10-03T14:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.347821 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.382771 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\
\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.423584 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.435003 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.435263 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.435275 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.435290 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:27 crc kubenswrapper[4774]: 
I1003 14:43:27.435299 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:27Z","lastTransitionTime":"2025-10-03T14:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.469885 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" event={"ID":"ca37ac4b-f421-4198-a179-12901d36f0f5","Type":"ContainerStarted","Data":"6ea15ebb8d160352352701d64bc6a27907327b9d94e4003cade95d3c48ff923a"} Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.469932 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" event={"ID":"ca37ac4b-f421-4198-a179-12901d36f0f5","Type":"ContainerStarted","Data":"bac2decca2a2a2fda80b2eb3cf96d985ca649fc4317858f0c2cb356d57d5c055"} Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.471403 4774 generic.go:334] "Generic (PLEG): container finished" podID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerID="583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b" exitCode=0 Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.471444 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" event={"ID":"01bef0c3-23b3-4f49-8c33-3f2ec7503b12","Type":"ContainerDied","Data":"583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b"} Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.471517 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" 
event={"ID":"01bef0c3-23b3-4f49-8c33-3f2ec7503b12","Type":"ContainerStarted","Data":"c515b30d898275d86790d1d37cb131444b0cb4a19d897d79dccb46b7ed13ae4b"} Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.472954 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rsftk" event={"ID":"710a59a2-4b99-43b8-89b7-a7a1c8723d38","Type":"ContainerStarted","Data":"7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986"} Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.474546 4774 generic.go:334] "Generic (PLEG): container finished" podID="cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e" containerID="3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c" exitCode=0 Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.474611 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" event={"ID":"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e","Type":"ContainerDied","Data":"3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c"} Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.475754 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jk5hb" event={"ID":"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53","Type":"ContainerStarted","Data":"67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b"} Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.489772 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.501194 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.536894 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.536936 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.536955 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.536976 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.536991 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:27Z","lastTransitionTime":"2025-10-03T14:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.543041 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.580861 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T14:43:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.621482 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.639191 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.639229 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.639237 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.639252 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.639264 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:27Z","lastTransitionTime":"2025-10-03T14:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.662565 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d94e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-03T14:43:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.698893 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.741165 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.741223 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.741245 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.741268 4774 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.741284 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:27Z","lastTransitionTime":"2025-10-03T14:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.751700 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.784119 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.819573 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.844338 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.844406 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.844418 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.844435 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.844448 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:27Z","lastTransitionTime":"2025-10-03T14:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.862311 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.903269 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.942049 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.946701 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.946752 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.946764 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.946783 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.946795 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:27Z","lastTransitionTime":"2025-10-03T14:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:27 crc kubenswrapper[4774]: I1003 14:43:27.983022 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.026419 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:28Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.049904 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.049933 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.049941 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.049954 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.049965 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:28Z","lastTransitionTime":"2025-10-03T14:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.065460 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576
068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:28Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.101657 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db
4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:28Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.147907 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:28Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.152560 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.152611 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.152623 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.152640 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.152652 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:28Z","lastTransitionTime":"2025-10-03T14:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.181015 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d94e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:28Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.221808 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:28Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.255302 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.255341 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.255352 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.255388 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.255404 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:28Z","lastTransitionTime":"2025-10-03T14:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.261970 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:28Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.298712 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.298772 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.298772 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:43:28 crc kubenswrapper[4774]: E1003 14:43:28.298848 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:43:28 crc kubenswrapper[4774]: E1003 14:43:28.298994 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:43:28 crc kubenswrapper[4774]: E1003 14:43:28.299070 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.299845 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:28Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.342305 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T14:43:28Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.365100 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.365158 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.365176 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.365200 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.365219 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:28Z","lastTransitionTime":"2025-10-03T14:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.382975 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:28Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.425741 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:28Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.461079 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:28Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.467819 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.467852 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.467860 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.467874 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.467883 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:28Z","lastTransitionTime":"2025-10-03T14:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.481364 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" event={"ID":"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e","Type":"ContainerStarted","Data":"6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16"} Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.500114 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:28Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.583073 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:28Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.584906 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.584938 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.584947 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.584963 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.584974 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:28Z","lastTransitionTime":"2025-10-03T14:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.597738 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:28Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.618424 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:28Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.659243 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:28Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.688111 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.688140 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.688149 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.688162 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.688171 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:28Z","lastTransitionTime":"2025-10-03T14:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.699807 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:28Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.739199 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:28Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.778740 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d94e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4
317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:28Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.789352 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.789424 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.789437 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:28 crc 
kubenswrapper[4774]: I1003 14:43:28.789451 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.789461 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:28Z","lastTransitionTime":"2025-10-03T14:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.848436 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:28Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.859447 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:28Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.892557 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.892593 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.892606 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 
14:43:28.892622 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.892634 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:28Z","lastTransitionTime":"2025-10-03T14:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.905581 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:28Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.942246 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:28Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.979590 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:28Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.994733 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.994814 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.994827 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.994845 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:28 crc kubenswrapper[4774]: I1003 14:43:28.994858 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:28Z","lastTransitionTime":"2025-10-03T14:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.019720 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:29Z 
is after 2025-08-24T17:21:41Z" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.062041 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.098042 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.098090 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.098105 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.098126 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 
14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.098141 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:29Z","lastTransitionTime":"2025-10-03T14:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.102252 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f41
6f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.147027 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.179772 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.199949 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.199981 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.199989 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.200003 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.200014 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:29Z","lastTransitionTime":"2025-10-03T14:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.220522 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.302090 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.302130 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.302143 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.302164 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.302176 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:29Z","lastTransitionTime":"2025-10-03T14:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.310520 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576
068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.323985 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db
4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.344153 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.380501 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.404268 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.404306 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.404315 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.404329 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.404339 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:29Z","lastTransitionTime":"2025-10-03T14:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.420463 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.459034 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.486296 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" event={"ID":"01bef0c3-23b3-4f49-8c33-3f2ec7503b12","Type":"ContainerStarted","Data":"aaa74c70ca8d93d672246f3451771bffa1e304c88a453cbbe3a305a304954fe7"} Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.486361 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" 
event={"ID":"01bef0c3-23b3-4f49-8c33-3f2ec7503b12","Type":"ContainerStarted","Data":"bc0d624981ca616d27057de09511f8a17b643686871e2e77c15bd1c65fba9c95"} Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.488098 4774 generic.go:334] "Generic (PLEG): container finished" podID="cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e" containerID="6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16" exitCode=0 Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.488166 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" event={"ID":"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e","Type":"ContainerDied","Data":"6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16"} Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.500312 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb27
6703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.508178 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.508206 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.508215 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.508228 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.508238 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:29Z","lastTransitionTime":"2025-10-03T14:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.545901 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.579523 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d94e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4
317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.617188 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.617220 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.617229 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:29 crc 
kubenswrapper[4774]: I1003 14:43:29.617243 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.617252 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:29Z","lastTransitionTime":"2025-10-03T14:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.628461 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.661831 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.698603 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.720332 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.720367 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.720391 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.720407 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.720418 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:29Z","lastTransitionTime":"2025-10-03T14:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.740511 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:29Z 
is after 2025-08-24T17:21:41Z" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.785476 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.797131 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:43:29 crc kubenswrapper[4774]: E1003 14:43:29.797292 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-03 14:43:37.797277437 +0000 UTC m=+40.386480889 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.819689 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.824167 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.824200 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.824212 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.824228 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.824240 4774 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:29Z","lastTransitionTime":"2025-10-03T14:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.865186 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":
\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.898258 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.898319 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.898345 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.898385 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:43:29 crc kubenswrapper[4774]: E1003 14:43:29.898539 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 14:43:29 crc kubenswrapper[4774]: E1003 14:43:29.898560 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 14:43:29 crc kubenswrapper[4774]: E1003 14:43:29.898574 4774 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:43:29 crc kubenswrapper[4774]: E1003 14:43:29.898608 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 14:43:29 crc kubenswrapper[4774]: E1003 14:43:29.898637 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 14:43:29 crc kubenswrapper[4774]: E1003 14:43:29.898649 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 14:43:37.898628244 +0000 UTC m=+40.487831706 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:43:29 crc kubenswrapper[4774]: E1003 14:43:29.898650 4774 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:43:29 crc kubenswrapper[4774]: E1003 14:43:29.898661 4774 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 14:43:29 crc kubenswrapper[4774]: E1003 14:43:29.898674 4774 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 14:43:29 crc kubenswrapper[4774]: E1003 14:43:29.898690 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 14:43:37.898680605 +0000 UTC m=+40.487884057 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:43:29 crc kubenswrapper[4774]: E1003 14:43:29.898866 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 14:43:37.898833139 +0000 UTC m=+40.488036761 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 14:43:29 crc kubenswrapper[4774]: E1003 14:43:29.898889 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 14:43:37.89887738 +0000 UTC m=+40.488081022 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.903718 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.927839 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.927871 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.927882 4774 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.927898 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.927909 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:29Z","lastTransitionTime":"2025-10-03T14:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.938549 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:29 crc kubenswrapper[4774]: I1003 14:43:29.990148 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.026574 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.030349 4774 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.030394 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.030405 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.030423 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.030435 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:30Z","lastTransitionTime":"2025-10-03T14:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.061285 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.102764 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.133252 4774 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.133311 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.133324 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.133342 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.133354 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:30Z","lastTransitionTime":"2025-10-03T14:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.142026 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.187833 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\
\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.220708 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d9
4e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14
:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.235283 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.235312 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.235320 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.235334 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.235343 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:30Z","lastTransitionTime":"2025-10-03T14:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.264481 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.299413 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.299444 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:43:30 crc kubenswrapper[4774]: E1003 14:43:30.299595 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.299628 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:43:30 crc kubenswrapper[4774]: E1003 14:43:30.299773 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:43:30 crc kubenswrapper[4774]: E1003 14:43:30.299967 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.300813 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.337536 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.337630 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.337655 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.337687 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.337709 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:30Z","lastTransitionTime":"2025-10-03T14:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.341677 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.381123 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T14:43:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.426315 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.440120 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.440348 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.440422 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.440447 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.440461 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:30Z","lastTransitionTime":"2025-10-03T14:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.497267 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" event={"ID":"01bef0c3-23b3-4f49-8c33-3f2ec7503b12","Type":"ContainerStarted","Data":"b575a360d2ee43bc2629635a4ffda77a1b0e1a171bbd99b76b725784be537e9c"} Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.497344 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" event={"ID":"01bef0c3-23b3-4f49-8c33-3f2ec7503b12","Type":"ContainerStarted","Data":"8ae8d985dec947c332efa41abbc5c30a978f06942cc62a86ab45e90fc2e5d763"} Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.497402 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" event={"ID":"01bef0c3-23b3-4f49-8c33-3f2ec7503b12","Type":"ContainerStarted","Data":"b330d308f14c0b3b3630e89fe3e43728e17dbc3fe78cf9cae365a3ab0474c600"} Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.497428 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" event={"ID":"01bef0c3-23b3-4f49-8c33-3f2ec7503b12","Type":"ContainerStarted","Data":"fdd06fa447f9a388f7cabafaff05c8d6dc9aedc161d2097f81855318c8b7712e"} Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.500304 4774 generic.go:334] "Generic (PLEG): container finished" podID="cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e" containerID="5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7" exitCode=0 Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.500362 4774 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" event={"ID":"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e","Type":"ContainerDied","Data":"5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7"} Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.522843 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-
crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.543232 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.543276 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.543287 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.543304 4774 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.543315 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:30Z","lastTransitionTime":"2025-10-03T14:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.546795 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.571185 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\
\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.586344 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb
72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.620396 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.646229 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.646272 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.646284 4774 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.646301 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.646315 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:30Z","lastTransitionTime":"2025-10-03T14:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.659719 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d1
60352352701d64bc6a27907327b9d94e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.700746 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\
"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.742670 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.748274 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.748312 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.748322 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 
14:43:30.748335 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.748347 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:30Z","lastTransitionTime":"2025-10-03T14:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.780744 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.823267 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.850686 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.850960 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.851122 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.851286 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.851426 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:30Z","lastTransitionTime":"2025-10-03T14:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.862142 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.902885 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.945998 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 
14:43:30.953770 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.953802 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.953815 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.953833 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.953844 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:30Z","lastTransitionTime":"2025-10-03T14:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:30 crc kubenswrapper[4774]: I1003 14:43:30.979859 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.044238 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.055968 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.056012 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.056024 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.056044 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.056056 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:31Z","lastTransitionTime":"2025-10-03T14:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.158162 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.158201 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.158212 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.158226 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.158235 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:31Z","lastTransitionTime":"2025-10-03T14:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.270896 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.270950 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.270969 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.270993 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.271010 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:31Z","lastTransitionTime":"2025-10-03T14:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.373881 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.373940 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.373956 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.373977 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.373993 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:31Z","lastTransitionTime":"2025-10-03T14:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.476638 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.476681 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.476720 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.476740 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.476749 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:31Z","lastTransitionTime":"2025-10-03T14:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.505561 4774 generic.go:334] "Generic (PLEG): container finished" podID="cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e" containerID="61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc" exitCode=0 Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.505607 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" event={"ID":"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e","Type":"ContainerDied","Data":"61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc"} Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.521078 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.540236 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.562626 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.572804 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d94e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.581391 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.581431 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.581443 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.581459 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.581472 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:31Z","lastTransitionTime":"2025-10-03T14:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.585538 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.596272 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.607444 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.617530 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T14:43:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.629352 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.639825 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.652836 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.664199 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.682301 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.683425 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.683459 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.683470 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.683485 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.683497 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:31Z","lastTransitionTime":"2025-10-03T14:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.694887 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.703490 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.785616 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.785653 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.785665 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.785681 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.785691 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:31Z","lastTransitionTime":"2025-10-03T14:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.887495 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.887546 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.887565 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.887583 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.887594 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:31Z","lastTransitionTime":"2025-10-03T14:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.989992 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.990033 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.990045 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.990060 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:31 crc kubenswrapper[4774]: I1003 14:43:31.990069 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:31Z","lastTransitionTime":"2025-10-03T14:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.093070 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.093113 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.093129 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.093145 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.093155 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:32Z","lastTransitionTime":"2025-10-03T14:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.195956 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.195999 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.196010 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.196025 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.196033 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:32Z","lastTransitionTime":"2025-10-03T14:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.278106 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.278165 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.278177 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.278194 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.278205 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:32Z","lastTransitionTime":"2025-10-03T14:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.298588 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.298643 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.298648 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:43:32 crc kubenswrapper[4774]: E1003 14:43:32.298708 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:43:32 crc kubenswrapper[4774]: E1003 14:43:32.298619 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:32Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:32 crc kubenswrapper[4774]: E1003 14:43:32.298869 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:43:32 crc kubenswrapper[4774]: E1003 14:43:32.299026 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.304517 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.304593 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.304607 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.304631 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.304647 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:32Z","lastTransitionTime":"2025-10-03T14:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:32 crc kubenswrapper[4774]: E1003 14:43:32.318718 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:32Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.324359 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.324445 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.324465 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.324527 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.324543 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:32Z","lastTransitionTime":"2025-10-03T14:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:32 crc kubenswrapper[4774]: E1003 14:43:32.342482 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:32Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.346886 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.346933 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.346949 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.346971 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.346987 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:32Z","lastTransitionTime":"2025-10-03T14:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:32 crc kubenswrapper[4774]: E1003 14:43:32.361100 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:32Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.369000 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.369039 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.369052 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.369071 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.369084 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:32Z","lastTransitionTime":"2025-10-03T14:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:32 crc kubenswrapper[4774]: E1003 14:43:32.388977 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:32Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:32 crc kubenswrapper[4774]: E1003 14:43:32.389316 4774 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.390969 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.391024 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.391041 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.391064 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.391079 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:32Z","lastTransitionTime":"2025-10-03T14:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.493423 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.493704 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.493897 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.494099 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.494273 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:32Z","lastTransitionTime":"2025-10-03T14:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.514941 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" event={"ID":"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e","Type":"ContainerStarted","Data":"1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a"} Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.550062 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:4
3:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\
\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-cop
y\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:32Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.565652 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:32Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.582063 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:32Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.596839 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.596879 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.596890 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.596905 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.596917 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:32Z","lastTransitionTime":"2025-10-03T14:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.599061 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:32Z 
is after 2025-08-24T17:21:41Z" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.615565 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn
4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:32Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.626580 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:32Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.645987 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:32Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.664317 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:32Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.681727 4774 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a6
1a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:32Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.698023 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:32Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.698880 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.698905 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.698914 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.698928 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.698937 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:32Z","lastTransitionTime":"2025-10-03T14:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.715571 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:32Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.731720 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:32Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.748447 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d94e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4
317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:32Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.761569 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:32Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.773295 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:32Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.800490 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.800527 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.800536 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.800550 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.800558 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:32Z","lastTransitionTime":"2025-10-03T14:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.902668 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.902704 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.902714 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.902727 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:32 crc kubenswrapper[4774]: I1003 14:43:32.902736 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:32Z","lastTransitionTime":"2025-10-03T14:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.005874 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.005928 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.005941 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.005962 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.005975 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:33Z","lastTransitionTime":"2025-10-03T14:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.108783 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.108835 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.108849 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.108871 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.108886 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:33Z","lastTransitionTime":"2025-10-03T14:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.212017 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.212076 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.212095 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.212123 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.212140 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:33Z","lastTransitionTime":"2025-10-03T14:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.314624 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.314664 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.314672 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.314687 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.314696 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:33Z","lastTransitionTime":"2025-10-03T14:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.417270 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.417317 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.417330 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.417350 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.417362 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:33Z","lastTransitionTime":"2025-10-03T14:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.519704 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.519766 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.519785 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.519808 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.519825 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:33Z","lastTransitionTime":"2025-10-03T14:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.523176 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" event={"ID":"01bef0c3-23b3-4f49-8c33-3f2ec7503b12","Type":"ContainerStarted","Data":"ac1b1bb4b30aa1d11558ff126ad87e4dd7b9aaf255cd4a1860aacdee30a0bb15"} Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.527151 4774 generic.go:334] "Generic (PLEG): container finished" podID="cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e" containerID="1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a" exitCode=0 Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.527202 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" event={"ID":"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e","Type":"ContainerDied","Data":"1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a"} Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.542973 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:33Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.564819 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:33Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.581960 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\
\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:33Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.598952 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:33Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.610886 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d94e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4
317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:33Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.622017 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.622050 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.622061 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:33 crc 
kubenswrapper[4774]: I1003 14:43:33.622077 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.622087 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:33Z","lastTransitionTime":"2025-10-03T14:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.625028 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:33Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.639708 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:33Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.655916 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:33Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.672698 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T14:43:33Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.686485 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:33Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.704288 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:33Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.719523 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:33Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.723972 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.724006 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.724015 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.724029 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.724041 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:33Z","lastTransitionTime":"2025-10-03T14:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.736771 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:33Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.761127 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:33Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.777778 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:33Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.826581 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.826611 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.826622 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.826637 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.826647 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:33Z","lastTransitionTime":"2025-10-03T14:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.928916 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.928954 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.928968 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.928985 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:33 crc kubenswrapper[4774]: I1003 14:43:33.928995 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:33Z","lastTransitionTime":"2025-10-03T14:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.032032 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.032292 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.032300 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.032314 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.032323 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:34Z","lastTransitionTime":"2025-10-03T14:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.145327 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.145404 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.145423 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.145445 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.145461 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:34Z","lastTransitionTime":"2025-10-03T14:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.247864 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.247922 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.247943 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.247966 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.247982 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:34Z","lastTransitionTime":"2025-10-03T14:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.298998 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:43:34 crc kubenswrapper[4774]: E1003 14:43:34.299179 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.299745 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:43:34 crc kubenswrapper[4774]: E1003 14:43:34.299853 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.299925 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:43:34 crc kubenswrapper[4774]: E1003 14:43:34.300000 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.351018 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.351061 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.351070 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.351087 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.351098 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:34Z","lastTransitionTime":"2025-10-03T14:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.454114 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.454165 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.454183 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.454205 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.454219 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:34Z","lastTransitionTime":"2025-10-03T14:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.539059 4774 generic.go:334] "Generic (PLEG): container finished" podID="cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e" containerID="ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1" exitCode=0 Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.539121 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" event={"ID":"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e","Type":"ContainerDied","Data":"ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1"} Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.555827 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:34Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.556778 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.556824 4774 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.556842 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.556870 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.556888 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:34Z","lastTransitionTime":"2025-10-03T14:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.579335 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:34Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.604832 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\
\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:34Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.625806 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d9
4e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14
:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:34Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.640235 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:34Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.653843 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:34Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.658809 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.658845 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.658859 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 
14:43:34.658875 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.658886 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:34Z","lastTransitionTime":"2025-10-03T14:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.664669 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:34Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.676888 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T14:43:34Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.686507 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:34Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.700622 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:34Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.716411 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:34Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.725936 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:34Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.746809 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:34Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.759489 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:34Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.760589 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.760621 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.760630 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.760644 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.760653 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:34Z","lastTransitionTime":"2025-10-03T14:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.771065 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:34Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.862704 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.862777 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.862790 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.862806 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.862814 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:34Z","lastTransitionTime":"2025-10-03T14:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.965621 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.965664 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.965680 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.965699 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:34 crc kubenswrapper[4774]: I1003 14:43:34.965715 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:34Z","lastTransitionTime":"2025-10-03T14:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.068842 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.068896 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.068907 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.068925 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.068937 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:35Z","lastTransitionTime":"2025-10-03T14:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.171666 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.171741 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.171770 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.171800 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.171822 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:35Z","lastTransitionTime":"2025-10-03T14:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.274237 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.274287 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.274299 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.274316 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.274330 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:35Z","lastTransitionTime":"2025-10-03T14:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.376868 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.376907 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.376918 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.376934 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.376945 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:35Z","lastTransitionTime":"2025-10-03T14:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.479409 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.479447 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.479458 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.479473 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.479485 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:35Z","lastTransitionTime":"2025-10-03T14:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.545734 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" event={"ID":"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e","Type":"ContainerStarted","Data":"b16e8c0249ee2448082160abe417f67a17fac176add76d00dd5e755c0f9a6a00"} Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.550819 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" event={"ID":"01bef0c3-23b3-4f49-8c33-3f2ec7503b12","Type":"ContainerStarted","Data":"9949e09b836433403be3f3ce9a5226b926ee28efa766add03a7d0f8cef39fb0b"} Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.551685 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.551934 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.562026 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:35Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.577232 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:35Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.583045 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.583117 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.583134 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.583163 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.583183 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:35Z","lastTransitionTime":"2025-10-03T14:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.583508 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.583946 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.591678 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:35Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.601787 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T14:43:35Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.617089 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:35Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.632280 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d94e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4
317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:35Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.646956 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16e8c0249ee2448082160abe417f67a17fac176add76d00dd5e755c0f9a6a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc0
c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:33Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:35Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.657636 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:35Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.674164 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:35Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.684264 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:35Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.685924 4774 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.685957 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.685968 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.686005 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.686017 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:35Z","lastTransitionTime":"2025-10-03T14:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.694037 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:35Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.704808 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd
5d8d10acbde2de9b1163b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:35Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.715510 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:35Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.740149 4774 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/st
atic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",
\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282
e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:35Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.760761 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:35Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.773251 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:35Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.785012 4774 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a6
1a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:35Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.789495 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.789534 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.789545 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.789567 4774 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.789581 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:35Z","lastTransitionTime":"2025-10-03T14:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.803045 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd06fa447f9a388f7cabafaff05c8d6dc9aedc161d2097f81855318c8b7712e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b330d308f14c0b3b3630e89fe3e43728e17dbc3fe78cf9cae365a3ab0474c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b575a360d2ee43bc2629635a4ffda77a1b0e1a171bbd99b76b725784be537e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae8d985dec947c332efa41abbc5c30a978f06942cc62a86ab45e90fc2e5d763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aaa74c70ca8d93d672246f3451771bffa1e304c88a453cbbe3a305a304954fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0d624981ca616d27057de09511f8a17b643686871e2e77c15bd1c65fba9c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9949e09b836433403be3f3ce9a5226b926ee28efa766add03a7d0f8cef39fb0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac1b1bb4b30aa1d11558ff126ad87e4dd7b9aaf255cd4a1860aacdee30a0bb15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:35Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.818823 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-0
3T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:35Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.830546 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:35Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.842001 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:35Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.853940 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T14:43:35Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.867184 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:35Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.881245 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d94e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4
317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:35Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.892031 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.892081 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.892099 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:35 crc 
kubenswrapper[4774]: I1003 14:43:35.892121 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.892136 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:35Z","lastTransitionTime":"2025-10-03T14:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.900724 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:35Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.916207 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:35Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.928494 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:35Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.948640 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:35Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.965860 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16e8c0249ee2448082160abe417f67a17fac176add76d00dd5e755c0f9a6a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503
d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4
100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03
T14:43:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:35Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.976821 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:35Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.994833 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.994893 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.994908 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.994929 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:35 crc kubenswrapper[4774]: I1003 14:43:35.994947 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:35Z","lastTransitionTime":"2025-10-03T14:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.096997 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.097038 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.097049 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.097066 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.097077 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:36Z","lastTransitionTime":"2025-10-03T14:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.199715 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.200122 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.200139 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.200165 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.200179 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:36Z","lastTransitionTime":"2025-10-03T14:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.298711 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.298765 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.298843 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:43:36 crc kubenswrapper[4774]: E1003 14:43:36.298941 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:43:36 crc kubenswrapper[4774]: E1003 14:43:36.299049 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:43:36 crc kubenswrapper[4774]: E1003 14:43:36.299265 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.300192 4774 scope.go:117] "RemoveContainer" containerID="91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.302410 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.302461 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.302478 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.302500 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.302518 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:36Z","lastTransitionTime":"2025-10-03T14:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.406230 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.406283 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.406307 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.406334 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.406354 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:36Z","lastTransitionTime":"2025-10-03T14:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.509148 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.509289 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.509463 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.509485 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.509497 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:36Z","lastTransitionTime":"2025-10-03T14:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.554460 4774 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.612720 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.612764 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.612808 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.612830 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.612842 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:36Z","lastTransitionTime":"2025-10-03T14:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.715586 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.715647 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.715667 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.715691 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.715708 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:36Z","lastTransitionTime":"2025-10-03T14:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.817813 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.817870 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.817888 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.817912 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.817931 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:36Z","lastTransitionTime":"2025-10-03T14:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.919988 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.920012 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.920020 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.920031 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:36 crc kubenswrapper[4774]: I1003 14:43:36.920039 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:36Z","lastTransitionTime":"2025-10-03T14:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.023150 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.023201 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.023216 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.023231 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.023241 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:37Z","lastTransitionTime":"2025-10-03T14:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.125990 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.126029 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.126039 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.126054 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.126063 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:37Z","lastTransitionTime":"2025-10-03T14:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.228234 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.228291 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.228311 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.228329 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.228342 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:37Z","lastTransitionTime":"2025-10-03T14:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.330945 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.330993 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.331008 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.331023 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.331032 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:37Z","lastTransitionTime":"2025-10-03T14:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.433289 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.433331 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.433340 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.433354 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.433363 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:37Z","lastTransitionTime":"2025-10-03T14:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.536956 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.537004 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.537014 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.537030 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.537040 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:37Z","lastTransitionTime":"2025-10-03T14:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.558120 4774 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.639996 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.640058 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.640076 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.640098 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.640113 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:37Z","lastTransitionTime":"2025-10-03T14:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.742825 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.742889 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.742902 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.742925 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.742939 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:37Z","lastTransitionTime":"2025-10-03T14:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.846181 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.846225 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.846236 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.846256 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.846292 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:37Z","lastTransitionTime":"2025-10-03T14:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.892725 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:43:37 crc kubenswrapper[4774]: E1003 14:43:37.892940 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-03 14:43:53.892924193 +0000 UTC m=+56.482127645 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.948329 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.948383 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.948393 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.948409 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.948418 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:37Z","lastTransitionTime":"2025-10-03T14:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.993792 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.993831 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.993853 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:43:37 crc kubenswrapper[4774]: I1003 14:43:37.993880 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:43:37 crc kubenswrapper[4774]: E1003 14:43:37.993970 4774 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Oct 03 14:43:37 crc kubenswrapper[4774]: E1003 14:43:37.994037 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 14:43:53.994019284 +0000 UTC m=+56.583222736 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 14:43:37 crc kubenswrapper[4774]: E1003 14:43:37.994040 4774 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 14:43:37 crc kubenswrapper[4774]: E1003 14:43:37.994047 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 14:43:37 crc kubenswrapper[4774]: E1003 14:43:37.994231 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 14:43:37 crc kubenswrapper[4774]: E1003 14:43:37.994246 4774 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:43:37 crc kubenswrapper[4774]: E1003 14:43:37.994047 4774 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 14:43:37 crc kubenswrapper[4774]: E1003 14:43:37.994300 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 14:43:37 crc kubenswrapper[4774]: E1003 14:43:37.994307 4774 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:43:37 crc kubenswrapper[4774]: E1003 14:43:37.994179 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 14:43:53.994154097 +0000 UTC m=+56.583357549 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 14:43:37 crc kubenswrapper[4774]: E1003 14:43:37.994334 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 14:43:53.994325171 +0000 UTC m=+56.583528803 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:43:37 crc kubenswrapper[4774]: E1003 14:43:37.994360 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 14:43:53.994348702 +0000 UTC m=+56.583552354 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.050616 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.050649 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.050657 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.050670 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.050682 4774 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:38Z","lastTransitionTime":"2025-10-03T14:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.153409 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.153453 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.153465 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.153482 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.153494 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:38Z","lastTransitionTime":"2025-10-03T14:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.256915 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.256962 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.256973 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.256993 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.257005 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:38Z","lastTransitionTime":"2025-10-03T14:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.281298 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx"] Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.281686 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.289167 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.294041 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.298820 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.298827 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.298827 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:43:38 crc kubenswrapper[4774]: E1003 14:43:38.299060 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:43:38 crc kubenswrapper[4774]: E1003 14:43:38.298932 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:43:38 crc kubenswrapper[4774]: E1003 14:43:38.299218 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.305141 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:38Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.324518 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd06fa447f9a388f7cabafaff05c8d6dc9aedc161d2097f81855318c8b7712e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b330d308f14c0b3b3630e89fe3e43728e17dbc3fe78cf9cae365a3ab0474c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b575a360d2ee43bc2629635a4ffda77a1b0e1a171bbd99b76b725784be537e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae8d985dec947c332efa41abbc5c30a978f06942cc62a86ab45e90fc2e5d763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aaa74c70ca8d93d672246f3451771bffa1e304c88a453cbbe3a305a304954fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0d624981ca616d27057de09511f8a17b643686871e2e77c15bd1c65fba9c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9949e09b836433403be3f3ce9a5226b926ee28efa766add03a7d0f8cef39fb0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac1b1bb4b30aa1d11558ff126ad87e4dd7b9aaf255cd4a1860aacdee30a0bb15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:38Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.337394 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:38Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.350307 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:38Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.359969 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.360029 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.360042 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.360086 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.360099 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:38Z","lastTransitionTime":"2025-10-03T14:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.363960 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:38Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.374531 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T14:43:38Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.387153 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:38Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.398031 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4a01f2ab-7f7c-411c-b424-0d382dee6976-env-overrides\") pod \"ovnkube-control-plane-749d76644c-jlwtx\" (UID: \"4a01f2ab-7f7c-411c-b424-0d382dee6976\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.398082 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4a01f2ab-7f7c-411c-b424-0d382dee6976-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jlwtx\" (UID: \"4a01f2ab-7f7c-411c-b424-0d382dee6976\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" Oct 03 14:43:38 crc 
kubenswrapper[4774]: I1003 14:43:38.398103 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4a01f2ab-7f7c-411c-b424-0d382dee6976-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jlwtx\" (UID: \"4a01f2ab-7f7c-411c-b424-0d382dee6976\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.398117 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqxln\" (UniqueName: \"kubernetes.io/projected/4a01f2ab-7f7c-411c-b424-0d382dee6976-kube-api-access-wqxln\") pod \"ovnkube-control-plane-749d76644c-jlwtx\" (UID: \"4a01f2ab-7f7c-411c-b424-0d382dee6976\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.399034 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d94e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4
317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:38Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.409713 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:38Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.433178 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
25-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:38Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.444711 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:38Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.455885 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:38Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.463170 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.463211 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.463230 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.463248 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.463258 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:38Z","lastTransitionTime":"2025-10-03T14:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.475635 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:38Z 
is after 2025-08-24T17:21:41Z" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.488871 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16e8c0249ee2448082160abe417f67a17fac176add76d00dd5e755c0f9a6a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:38Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.498297 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:355
12335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:38Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.499041 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4a01f2ab-7f7c-411c-b424-0d382dee6976-env-overrides\") pod \"ovnkube-control-plane-749d76644c-jlwtx\" (UID: \"4a01f2ab-7f7c-411c-b424-0d382dee6976\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.499120 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4a01f2ab-7f7c-411c-b424-0d382dee6976-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jlwtx\" (UID: \"4a01f2ab-7f7c-411c-b424-0d382dee6976\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.499168 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4a01f2ab-7f7c-411c-b424-0d382dee6976-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jlwtx\" (UID: \"4a01f2ab-7f7c-411c-b424-0d382dee6976\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.499200 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqxln\" (UniqueName: \"kubernetes.io/projected/4a01f2ab-7f7c-411c-b424-0d382dee6976-kube-api-access-wqxln\") pod \"ovnkube-control-plane-749d76644c-jlwtx\" (UID: \"4a01f2ab-7f7c-411c-b424-0d382dee6976\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.515622 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a01f2ab-7f7c-411c-b424-0d382dee6976\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jlwtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:38Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.562547 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.564309 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"686819b674279f59619905ac4ba451ca79c97fc652a773307eca3fad2352ddda"} Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.564850 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.565943 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.565973 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.565984 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.565997 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:38 crc 
kubenswrapper[4774]: I1003 14:43:38.566007 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:38Z","lastTransitionTime":"2025-10-03T14:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.578637 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4a01f2ab-7f7c-411c-b424-0d382dee6976-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jlwtx\" (UID: \"4a01f2ab-7f7c-411c-b424-0d382dee6976\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.578735 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4a01f2ab-7f7c-411c-b424-0d382dee6976-env-overrides\") pod \"ovnkube-control-plane-749d76644c-jlwtx\" (UID: \"4a01f2ab-7f7c-411c-b424-0d382dee6976\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.580420 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:38Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.583265 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqxln\" (UniqueName: \"kubernetes.io/projected/4a01f2ab-7f7c-411c-b424-0d382dee6976-kube-api-access-wqxln\") pod \"ovnkube-control-plane-749d76644c-jlwtx\" (UID: \"4a01f2ab-7f7c-411c-b424-0d382dee6976\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.583279 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4a01f2ab-7f7c-411c-b424-0d382dee6976-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jlwtx\" (UID: \"4a01f2ab-7f7c-411c-b424-0d382dee6976\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.596933 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:38Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.597211 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.610207 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:38Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.621903 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d94e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4
317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:38Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.633875 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:38Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.645772 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:38Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.668255 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:38Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.668664 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.668689 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.668701 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.668720 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.668735 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:38Z","lastTransitionTime":"2025-10-03T14:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:38 crc kubenswrapper[4774]: W1003 14:43:38.685074 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a01f2ab_7f7c_411c_b424_0d382dee6976.slice/crio-84f29bde276c20a2ac835be6e6ec190e951241fc9602f7a9b72de3b51de7b506 WatchSource:0}: Error finding container 84f29bde276c20a2ac835be6e6ec190e951241fc9602f7a9b72de3b51de7b506: Status 404 returned error can't find the container with id 84f29bde276c20a2ac835be6e6ec190e951241fc9602f7a9b72de3b51de7b506 Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.687746 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:38Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.697844 4774 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:38Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.714210 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.16
8.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:38Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.728260 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16e8c0249ee2448082160abe417f67a17fac176add76d00dd5e755c0f9a6a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd
1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\"
:{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482a
a0a39b9cb8b23ba1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:38Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.737379 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:38Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.753420 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a01f2ab-7f7c-411c-b424-0d382dee6976\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jlwtx\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:38Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.771473 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.771515 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.771526 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.771543 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.771552 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:38Z","lastTransitionTime":"2025-10-03T14:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.778765 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd06fa447f9a388f7cabafaff05c8d6dc9aedc161d2097f81855318c8b7712e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b330d308f14c0b3b3630e89fe3e43728e17dbc3fe78cf9cae365a3ab0474c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b575a360d2ee43bc2629635a4ffda77a1b0e1a171bbd99b76b725784be537e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae8d985dec947c332efa41abbc5c30a978f06942cc62a86ab45e90fc2e5d763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aaa74c70ca8d93d672246f3451771bffa1e304c88a453cbbe3a305a304954fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0d624981ca616d27057de09511f8a17b643686871e2e77c15bd1c65fba9c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9949e09b836433403be3f3ce9a5226b926ee28efa766add03a7d0f8cef39fb0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac1b1bb4b30aa1d11558ff126ad87e4dd7b9aaf255cd4a1860aacdee30a0bb15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:38Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.791798 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:38Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.807155 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://686819b674279f59619905ac4ba451ca79c97fc652a773307eca3fad2352ddda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:38Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.873734 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.873814 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.873827 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.873844 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.873856 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:38Z","lastTransitionTime":"2025-10-03T14:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.975901 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.975941 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.975953 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.975968 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:38 crc kubenswrapper[4774]: I1003 14:43:38.975979 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:38Z","lastTransitionTime":"2025-10-03T14:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.077992 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.078027 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.078039 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.078055 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.078065 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:39Z","lastTransitionTime":"2025-10-03T14:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.180961 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.181000 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.181008 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.181023 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.181033 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:39Z","lastTransitionTime":"2025-10-03T14:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.283533 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.283608 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.283626 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.283658 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.283676 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:39Z","lastTransitionTime":"2025-10-03T14:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.311770 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a01f2ab-7f7c-411c-b424-0d382dee6976\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jlwtx\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:39Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.327223 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:39Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.344696 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://686819b674279f59619905ac4ba451ca79c97fc652a773307eca3fad2352ddda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:39Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.365487 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd06fa447f9a388f7cabafaff05c8d6dc9aedc161d2097f81855318c8b7712e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b330d308f14c0b3b3630e89fe3e43728e17dbc3fe78cf9cae365a3ab0474c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b575a360d2ee43bc2629635a4ffda77a1b0e1a171bbd99b76b725784be537e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae8d985dec947c332efa41abbc5c30a978f06942cc62a86ab45e90fc2e5d763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aaa74c70ca8d93d672246f3451771bffa1e304c88a453cbbe3a305a304954fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0d624981ca616d27057de09511f8a17b643686871e2e77c15bd1c65fba9c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9949e09b836433403be3f3ce9a5226b926ee28efa766add03a7d0f8cef39fb0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac1b1bb4b30aa1d11558ff126ad87e4dd7b9aaf255cd4a1860aacdee30a0bb15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:39Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.384103 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-0
3T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:39Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.386008 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.386036 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.386044 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.386057 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.386065 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:39Z","lastTransitionTime":"2025-10-03T14:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.397200 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:39Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.409080 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:39Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.421127 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T14:43:39Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.436553 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:39Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.449634 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d94e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4
317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:39Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.458262 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:39Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.477661 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:39Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.488888 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.488942 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.488953 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.488971 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 
14:43:39.488981 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:39Z","lastTransitionTime":"2025-10-03T14:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.492180 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"nam
e\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:39Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.502993 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:39Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.518882 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:39Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.535456 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16e8c0249ee2448082160abe417f67a17fac176add76d00dd5e755c0f9a6a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503
d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4
100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03
T14:43:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:39Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.567667 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" event={"ID":"4a01f2ab-7f7c-411c-b424-0d382dee6976","Type":"ContainerStarted","Data":"84f29bde276c20a2ac835be6e6ec190e951241fc9602f7a9b72de3b51de7b506"} Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.591097 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.591152 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.591169 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.591228 4774 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.591243 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:39Z","lastTransitionTime":"2025-10-03T14:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.693260 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.693323 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.693337 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.693357 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.693386 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:39Z","lastTransitionTime":"2025-10-03T14:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.796584 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.796627 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.796640 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.796656 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.796670 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:39Z","lastTransitionTime":"2025-10-03T14:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.898863 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.898908 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.898920 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.898936 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:39 crc kubenswrapper[4774]: I1003 14:43:39.898948 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:39Z","lastTransitionTime":"2025-10-03T14:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.000872 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.000947 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.000970 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.000998 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.001020 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:40Z","lastTransitionTime":"2025-10-03T14:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.103440 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.103491 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.103503 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.103521 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.103533 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:40Z","lastTransitionTime":"2025-10-03T14:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.124278 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-ghf5t"] Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.124751 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:43:40 crc kubenswrapper[4774]: E1003 14:43:40.124817 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.136700 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ghf5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d3c89f-9fbd-4d50-840a-c5c78528c903\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ghf5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:40Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:40 crc 
kubenswrapper[4774]: I1003 14:43:40.154338 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:40Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.168516 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://686819b674279f59619905ac4ba451ca79c97fc652a773307eca3fad2352ddda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:40Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.192025 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd06fa447f9a388f7cabafaff05c8d6dc9aedc161d2097f81855318c8b7712e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b330d308f14c0b3b3630e89fe3e43728e17dbc3fe78cf9cae365a3ab0474c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b575a360d2ee43bc2629635a4ffda77a1b0e1a171bbd99b76b725784be537e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae8d985dec947c332efa41abbc5c30a978f06942cc62a86ab45e90fc2e5d763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aaa74c70ca8d93d672246f3451771bffa1e304c88a453cbbe3a305a304954fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0d624981ca616d27057de09511f8a17b643686871e2e77c15bd1c65fba9c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9949e09b836433403be3f3ce9a5226b926ee28efa766add03a7d0f8cef39fb0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac1b1bb4b30aa1d11558ff126ad87e4dd7b9aaf255cd4a1860aacdee30a0bb15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:40Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.206551 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.206592 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.206610 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.206625 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.206635 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:40Z","lastTransitionTime":"2025-10-03T14:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.212392 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnpd7\" (UniqueName: \"kubernetes.io/projected/88d3c89f-9fbd-4d50-840a-c5c78528c903-kube-api-access-vnpd7\") pod \"network-metrics-daemon-ghf5t\" (UID: \"88d3c89f-9fbd-4d50-840a-c5c78528c903\") " pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.212433 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88d3c89f-9fbd-4d50-840a-c5c78528c903-metrics-certs\") pod \"network-metrics-daemon-ghf5t\" (UID: \"88d3c89f-9fbd-4d50-840a-c5c78528c903\") " pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.214931 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T14:43:40Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.229453 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:40Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.242716 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d94e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4
317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:40Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.262777 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:40Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.277357 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:40Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.293191 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:40Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.298677 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.298712 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.298716 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:43:40 crc kubenswrapper[4774]: E1003 14:43:40.298812 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:43:40 crc kubenswrapper[4774]: E1003 14:43:40.298918 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:43:40 crc kubenswrapper[4774]: E1003 14:43:40.299007 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.310027 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":
\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:40Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.311998 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.312036 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.312054 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.312076 4774 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.312090 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:40Z","lastTransitionTime":"2025-10-03T14:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.313337 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88d3c89f-9fbd-4d50-840a-c5c78528c903-metrics-certs\") pod \"network-metrics-daemon-ghf5t\" (UID: \"88d3c89f-9fbd-4d50-840a-c5c78528c903\") " pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.313390 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnpd7\" (UniqueName: \"kubernetes.io/projected/88d3c89f-9fbd-4d50-840a-c5c78528c903-kube-api-access-vnpd7\") pod \"network-metrics-daemon-ghf5t\" (UID: \"88d3c89f-9fbd-4d50-840a-c5c78528c903\") " pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:43:40 crc kubenswrapper[4774]: E1003 14:43:40.313573 4774 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 14:43:40 crc kubenswrapper[4774]: E1003 14:43:40.313670 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88d3c89f-9fbd-4d50-840a-c5c78528c903-metrics-certs podName:88d3c89f-9fbd-4d50-840a-c5c78528c903 nodeName:}" failed. No retries permitted until 2025-10-03 14:43:40.813645538 +0000 UTC m=+43.402849020 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/88d3c89f-9fbd-4d50-840a-c5c78528c903-metrics-certs") pod "network-metrics-daemon-ghf5t" (UID: "88d3c89f-9fbd-4d50-840a-c5c78528c903") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.323936 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:40Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.337131 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:40Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.338990 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnpd7\" (UniqueName: 
\"kubernetes.io/projected/88d3c89f-9fbd-4d50-840a-c5c78528c903-kube-api-access-vnpd7\") pod \"network-metrics-daemon-ghf5t\" (UID: \"88d3c89f-9fbd-4d50-840a-c5c78528c903\") " pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.353325 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16e8c0249ee2448082160abe417f67a17fac176add76d00dd5e755c0f9a6a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43
:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b
2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\
\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:40Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.365153 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:40Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.385632 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:40Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.395834 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a01f2ab-7f7c-411c-b424-0d382dee6976\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jlwtx\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:40Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.414999 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.415058 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.415071 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.415088 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.415099 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:40Z","lastTransitionTime":"2025-10-03T14:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.517727 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.518037 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.518046 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.518059 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.518068 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:40Z","lastTransitionTime":"2025-10-03T14:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.620588 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.620633 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.620644 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.620661 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.620672 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:40Z","lastTransitionTime":"2025-10-03T14:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.728985 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.729033 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.729044 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.729062 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.729073 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:40Z","lastTransitionTime":"2025-10-03T14:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.817527 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88d3c89f-9fbd-4d50-840a-c5c78528c903-metrics-certs\") pod \"network-metrics-daemon-ghf5t\" (UID: \"88d3c89f-9fbd-4d50-840a-c5c78528c903\") " pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:43:40 crc kubenswrapper[4774]: E1003 14:43:40.817676 4774 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 14:43:40 crc kubenswrapper[4774]: E1003 14:43:40.817747 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88d3c89f-9fbd-4d50-840a-c5c78528c903-metrics-certs podName:88d3c89f-9fbd-4d50-840a-c5c78528c903 nodeName:}" failed. No retries permitted until 2025-10-03 14:43:41.817721984 +0000 UTC m=+44.406925436 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/88d3c89f-9fbd-4d50-840a-c5c78528c903-metrics-certs") pod "network-metrics-daemon-ghf5t" (UID: "88d3c89f-9fbd-4d50-840a-c5c78528c903") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.831737 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.831792 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.831803 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.831823 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.831840 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:40Z","lastTransitionTime":"2025-10-03T14:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.935418 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.935451 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.935463 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.935480 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:40 crc kubenswrapper[4774]: I1003 14:43:40.935493 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:40Z","lastTransitionTime":"2025-10-03T14:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.037883 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.037944 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.037956 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.037971 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.037983 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:41Z","lastTransitionTime":"2025-10-03T14:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.140201 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.140245 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.140258 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.140284 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.140298 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:41Z","lastTransitionTime":"2025-10-03T14:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.242136 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.242166 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.242175 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.242187 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.242196 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:41Z","lastTransitionTime":"2025-10-03T14:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.299450 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:43:41 crc kubenswrapper[4774]: E1003 14:43:41.299602 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.344689 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.344737 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.344754 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.344775 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.344795 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:41Z","lastTransitionTime":"2025-10-03T14:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.447547 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.447580 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.447723 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.447740 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.447757 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:41Z","lastTransitionTime":"2025-10-03T14:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.550249 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.550295 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.550305 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.550320 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.550334 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:41Z","lastTransitionTime":"2025-10-03T14:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.575082 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzv75_01bef0c3-23b3-4f49-8c33-3f2ec7503b12/ovnkube-controller/0.log" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.577809 4774 generic.go:334] "Generic (PLEG): container finished" podID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerID="9949e09b836433403be3f3ce9a5226b926ee28efa766add03a7d0f8cef39fb0b" exitCode=1 Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.577880 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" event={"ID":"01bef0c3-23b3-4f49-8c33-3f2ec7503b12","Type":"ContainerDied","Data":"9949e09b836433403be3f3ce9a5226b926ee28efa766add03a7d0f8cef39fb0b"} Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.578905 4774 scope.go:117] "RemoveContainer" containerID="9949e09b836433403be3f3ce9a5226b926ee28efa766add03a7d0f8cef39fb0b" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.579711 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" event={"ID":"4a01f2ab-7f7c-411c-b424-0d382dee6976","Type":"ContainerStarted","Data":"3803c97bfaf2ffb04a8a6924cb97ee022c7d34a38e1a50ce63a7b8f062fc901d"} Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.579738 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" event={"ID":"4a01f2ab-7f7c-411c-b424-0d382dee6976","Type":"ContainerStarted","Data":"5b65b48ad13df81dd9ccb04fd8aeebbf976409d5d318f51a923c2dfc0cd8cc7b"} Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.590382 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.613950 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.626136 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.637664 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.648169 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.652019 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.652052 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.652061 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.652074 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.652081 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:41Z","lastTransitionTime":"2025-10-03T14:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.662481 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16e8c0249ee2448082160abe417f67a17fac176add76d00dd5e755c0f9a6a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.672908 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a01f2ab-7f7c-411c-b424-0d382dee6976\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jlwtx\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.684269 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.695432 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://686819b674279f59619905ac4ba451ca79c97fc652a773307eca3fad2352ddda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.711628 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd06fa447f9a388f7cabafaff05c8d6dc9aedc161d2097f81855318c8b7712e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b330d308f14c0b3b3630e89fe3e43728e17dbc3fe78cf9cae365a3ab0474c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b575a360d2ee43bc2629635a4ffda77a1b0e1a171bbd99b76b725784be537e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae8d985dec947c332efa41abbc5c30a978f06942cc62a86ab45e90fc2e5d763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aaa74c70ca8d93d672246f3451771bffa1e304c88a453cbbe3a305a304954fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0d624981ca616d27057de09511f8a17b643686871e2e77c15bd1c65fba9c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9949e09b836433403be3f3ce9a5226b926ee28efa766add03a7d0f8cef39fb0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9949e09b836433403be3f3ce9a5226b926ee28efa766add03a7d0f8cef39fb0b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"message\\\":\\\"1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 14:43:41.448023 6091 reflector.go:311] Stopping reflector *v1.EndpointSlice 
(0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1003 14:43:41.448068 6091 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1003 14:43:41.448186 6091 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1003 14:43:41.448517 6091 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1003 14:43:41.448572 6091 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1003 14:43:41.448578 6091 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1003 14:43:41.448595 6091 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 14:43:41.448615 6091 factory.go:656] Stopping watch factory\\\\nI1003 14:43:41.448627 6091 ovnkube.go:599] Stopped ovnkube\\\\nI1003 14:43:41.448661 6091 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 14:43:41.448662 6091 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 14:43:41.448676 6091 handler.go:208] Removed *v1.EgressFirewall event handler 
9\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac1b1bb4b30aa1d11558ff126ad87e4dd7b9aaf255cd4a1860aacdee30a0bb15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec4178458225
83a5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.723188 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ghf5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d3c89f-9fbd-4d50-840a-c5c78528c903\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ghf5t\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.738611 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.752884 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.754577 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.754608 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.754619 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.754637 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.754648 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:41Z","lastTransitionTime":"2025-10-03T14:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.767861 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.779132 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T14:43:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.793455 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.804890 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d94e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4
317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.825630 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd06fa447f9a388f7cabafaff05c8d6dc9aedc161d2097f81855318c8b7712e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b330d308f14c0b3b3630e89fe3e43728e17dbc3fe78cf9cae365a3ab0474c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b575a360d2ee43bc2629635a4ffda77a1b0e1a171bbd99b76b725784be537e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae8d985dec947c332efa41abbc5c30a978f06942cc62a86ab45e90fc2e5d763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aaa74c70ca8d93d672246f3451771bffa1e304c88a453cbbe3a305a304954fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0d624981ca616d27057de09511f8a17b643686871e2e77c15bd1c65fba9c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9949e09b836433403be3f3ce9a5226b926ee28efa766add03a7d0f8cef39fb0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9949e09b836433403be3f3ce9a5226b926ee28efa766add03a7d0f8cef39fb0b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"message\\\":\\\"1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 14:43:41.448023 6091 reflector.go:311] Stopping reflector *v1.EndpointSlice 
(0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1003 14:43:41.448068 6091 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1003 14:43:41.448186 6091 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1003 14:43:41.448517 6091 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1003 14:43:41.448572 6091 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1003 14:43:41.448578 6091 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1003 14:43:41.448595 6091 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 14:43:41.448615 6091 factory.go:656] Stopping watch factory\\\\nI1003 14:43:41.448627 6091 ovnkube.go:599] Stopped ovnkube\\\\nI1003 14:43:41.448661 6091 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 14:43:41.448662 6091 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 14:43:41.448676 6091 handler.go:208] Removed *v1.EgressFirewall event handler 
9\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac1b1bb4b30aa1d11558ff126ad87e4dd7b9aaf255cd4a1860aacdee30a0bb15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec4178458225
83a5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.827685 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88d3c89f-9fbd-4d50-840a-c5c78528c903-metrics-certs\") pod \"network-metrics-daemon-ghf5t\" (UID: \"88d3c89f-9fbd-4d50-840a-c5c78528c903\") " pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:43:41 crc kubenswrapper[4774]: E1003 14:43:41.827815 4774 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 14:43:41 crc kubenswrapper[4774]: E1003 14:43:41.827908 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88d3c89f-9fbd-4d50-840a-c5c78528c903-metrics-certs podName:88d3c89f-9fbd-4d50-840a-c5c78528c903 nodeName:}" failed. No retries permitted until 2025-10-03 14:43:43.827888113 +0000 UTC m=+46.417091565 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/88d3c89f-9fbd-4d50-840a-c5c78528c903-metrics-certs") pod "network-metrics-daemon-ghf5t" (UID: "88d3c89f-9fbd-4d50-840a-c5c78528c903") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.836617 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ghf5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d3c89f-9fbd-4d50-840a-c5c78528c903\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ghf5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:41 crc 
kubenswrapper[4774]: I1003 14:43:41.848271 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.857197 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.857223 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.857233 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.857249 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.857259 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:41Z","lastTransitionTime":"2025-10-03T14:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.863908 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://686819b674279f59619905ac4ba451ca79c97fc652a773307eca3fad2352ddda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.885016 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.898735 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T14:43:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.912213 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.925311 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d94e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4
317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.938979 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.951852 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.962155 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.962215 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.962224 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.962237 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.962246 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:41Z","lastTransitionTime":"2025-10-03T14:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.968566 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.982907 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:41 crc kubenswrapper[4774]: I1003 14:43:41.994858 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.013213 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.030134 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16e8c0249ee2448082160abe417f67a17fac176add76d00dd5e755c0f9a6a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503
d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4
100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03
T14:43:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.042323 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.053617 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a01f2ab-7f7c-411c-b424-0d382dee6976\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b65b48ad13df81dd9ccb04fd8aeebbf976409d5d318f51a923c2dfc0cd8cc7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3803c97bfaf2ffb04a8a6924cb97ee022c7d34a38e1a50ce63a7b8f062fc901d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jlwtx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.070693 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.070728 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.070738 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.070751 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.070760 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:42Z","lastTransitionTime":"2025-10-03T14:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.173506 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.173555 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.173568 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.173595 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.173611 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:42Z","lastTransitionTime":"2025-10-03T14:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.275254 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.275288 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.275299 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.275315 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.275325 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:42Z","lastTransitionTime":"2025-10-03T14:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.298560 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.298582 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.298631 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:43:42 crc kubenswrapper[4774]: E1003 14:43:42.298708 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:43:42 crc kubenswrapper[4774]: E1003 14:43:42.298798 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:43:42 crc kubenswrapper[4774]: E1003 14:43:42.298896 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.378736 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.378775 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.378785 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.378800 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.378813 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:42Z","lastTransitionTime":"2025-10-03T14:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.481113 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.481161 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.481173 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.481194 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.481211 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:42Z","lastTransitionTime":"2025-10-03T14:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.582615 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.582649 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.582658 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.582672 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.582681 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:42Z","lastTransitionTime":"2025-10-03T14:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.585535 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzv75_01bef0c3-23b3-4f49-8c33-3f2ec7503b12/ovnkube-controller/0.log" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.588974 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" event={"ID":"01bef0c3-23b3-4f49-8c33-3f2ec7503b12","Type":"ContainerStarted","Data":"32a9edd2bc21611da46c1c1e6a5412feb00ae4b15218128d6ed5e68f906db2c1"} Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.589080 4774 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.601624 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d94e4003c
ade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z
\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.612875 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.624805 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.636727 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.646734 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T14:43:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.657140 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.669398 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.683719 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16e8c0249ee2448082160abe417f67a17fac176add76d00dd5e755c0f9a6a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378
529aecd2971532c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:
43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:33Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.686655 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.686683 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.686691 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.686704 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.686713 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:42Z","lastTransitionTime":"2025-10-03T14:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.693812 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.716116 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.729589 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.740803 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.753913 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a01f2ab-7f7c-411c-b424-0d382dee6976\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b65b48ad13df81dd9ccb04fd8aeebbf976409d5d318f51a923c2dfc0cd8cc7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3803c97bfaf2ffb04a8a6924cb97ee022c7d34a38e1a50ce63a7b8f062fc901d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jlwtx\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.766040 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:4
3:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55
e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.779225 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://686819b674279f59619905ac4ba451ca79c97fc652a773307eca3fad2352ddda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.789683 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.789717 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.789727 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.789740 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.789750 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:42Z","lastTransitionTime":"2025-10-03T14:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.790682 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.790711 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.790723 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.790736 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.790746 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:42Z","lastTransitionTime":"2025-10-03T14:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.797045 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd06fa447f9a388f7cabafaff05c8d6dc9aedc161d2097f81855318c8b7712e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b330d308f14c0b3b3630e89fe3e43728e17dbc3fe78cf9cae365a3ab0474c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b575a360d2ee43bc2629635a4ffda77a1b0e1a171bbd99b76b725784be537e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae8d985dec947c332efa41abbc5c30a978f06942cc62a86ab45e90fc2e5d763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aaa74c70ca8d93d672246f3451771bffa1e304c88a453cbbe3a305a304954fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0d624981ca616d27057de09511f8a17b643686871e2e77c15bd1c65fba9c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32a9edd2bc21611da46c1c1e6a5412feb00ae4b15218128d6ed5e68f906db2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9949e09b836433403be3f3ce9a5226b926ee28efa766add03a7d0f8cef39fb0b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"message\\\":\\\"1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 14:43:41.448023 6091 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1003 14:43:41.448068 6091 reflector.go:311] Stopping reflector 
*v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1003 14:43:41.448186 6091 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1003 14:43:41.448517 6091 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1003 14:43:41.448572 6091 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1003 14:43:41.448578 6091 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1003 14:43:41.448595 6091 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 14:43:41.448615 6091 factory.go:656] Stopping watch factory\\\\nI1003 14:43:41.448627 6091 ovnkube.go:599] Stopped ovnkube\\\\nI1003 14:43:41.448661 6091 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 14:43:41.448662 6091 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 14:43:41.448676 6091 handler.go:208] Removed *v1.EgressFirewall event handler 
9\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"n
ame\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac1b1bb4b30aa1d11558ff126ad87e4dd7b9aaf255cd4a1860aacdee30a0bb15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:42 crc kubenswrapper[4774]: E1003 14:43:42.801569 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.805170 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.805206 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.805216 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.805232 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.805241 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:42Z","lastTransitionTime":"2025-10-03T14:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.808164 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ghf5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d3c89f-9fbd-4d50-840a-c5c78528c903\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ghf5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:42 crc 
kubenswrapper[4774]: E1003 14:43:42.816123 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.819417 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.819447 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.819456 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.819471 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.819481 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:42Z","lastTransitionTime":"2025-10-03T14:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:42 crc kubenswrapper[4774]: E1003 14:43:42.829343 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.834957 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.835028 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.835048 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.835077 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.835107 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:42Z","lastTransitionTime":"2025-10-03T14:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:42 crc kubenswrapper[4774]: E1003 14:43:42.847107 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.850566 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.850645 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.850672 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.850702 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.850724 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:42Z","lastTransitionTime":"2025-10-03T14:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:42 crc kubenswrapper[4774]: E1003 14:43:42.871622 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:42 crc kubenswrapper[4774]: E1003 14:43:42.871740 4774 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.892499 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.892550 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.892562 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.892582 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.892594 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:42Z","lastTransitionTime":"2025-10-03T14:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.995480 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.995532 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.995543 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.995563 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:42 crc kubenswrapper[4774]: I1003 14:43:42.995574 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:42Z","lastTransitionTime":"2025-10-03T14:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.098122 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.098165 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.098176 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.098198 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.098216 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:43Z","lastTransitionTime":"2025-10-03T14:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.200957 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.200995 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.201005 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.201019 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.201028 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:43Z","lastTransitionTime":"2025-10-03T14:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.299449 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:43:43 crc kubenswrapper[4774]: E1003 14:43:43.299654 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.303651 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.303746 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.303772 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.303802 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.303824 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:43Z","lastTransitionTime":"2025-10-03T14:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.407442 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.407502 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.407520 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.407540 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.407554 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:43Z","lastTransitionTime":"2025-10-03T14:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.510744 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.510784 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.510794 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.510809 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.510819 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:43Z","lastTransitionTime":"2025-10-03T14:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.593557 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzv75_01bef0c3-23b3-4f49-8c33-3f2ec7503b12/ovnkube-controller/1.log" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.594510 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzv75_01bef0c3-23b3-4f49-8c33-3f2ec7503b12/ovnkube-controller/0.log" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.598616 4774 generic.go:334] "Generic (PLEG): container finished" podID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerID="32a9edd2bc21611da46c1c1e6a5412feb00ae4b15218128d6ed5e68f906db2c1" exitCode=1 Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.598673 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" event={"ID":"01bef0c3-23b3-4f49-8c33-3f2ec7503b12","Type":"ContainerDied","Data":"32a9edd2bc21611da46c1c1e6a5412feb00ae4b15218128d6ed5e68f906db2c1"} Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.598718 4774 scope.go:117] "RemoveContainer" containerID="9949e09b836433403be3f3ce9a5226b926ee28efa766add03a7d0f8cef39fb0b" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.601140 4774 scope.go:117] "RemoveContainer" containerID="32a9edd2bc21611da46c1c1e6a5412feb00ae4b15218128d6ed5e68f906db2c1" Oct 03 14:43:43 crc kubenswrapper[4774]: E1003 14:43:43.601631 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-jzv75_openshift-ovn-kubernetes(01bef0c3-23b3-4f49-8c33-3f2ec7503b12)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.613974 4774 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.614018 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.614027 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.614041 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.614050 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:43Z","lastTransitionTime":"2025-10-03T14:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.630736 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:43Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.643684 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:43Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.656086 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:43Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.668388 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:43Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.682716 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16e8c0249ee2448082160abe417f67a17fac176add76d00dd5e755c0f9a6a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503
d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4
100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03
T14:43:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:43Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.693556 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:43Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.707163 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a01f2ab-7f7c-411c-b424-0d382dee6976\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b65b48ad13df81dd9ccb04fd8aeebbf976409d5d318f51a923c2dfc0cd8cc7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3803c97bfaf2ffb04a8a6924cb97ee022c7d34a38e1a50ce63a7b8f062fc901d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jlwtx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:43Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.716460 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.716489 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.716497 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.716509 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.716520 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:43Z","lastTransitionTime":"2025-10-03T14:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.720634 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576
068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:43Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.733492 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://686819b674279f59619905ac4ba451ca79c97fc652a773307eca3fad2352ddda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:43Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.750133 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd06fa447f9a388f7cabafaff05c8d6dc9aedc161d2097f81855318c8b7712e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b330d308f14c0b3b3630e89fe3e43728e17dbc3fe78cf9cae365a3ab0474c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b575a360d2ee43bc2629635a4ffda77a1b0e1a171bbd99b76b725784be537e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae8d985dec947c332efa41abbc5c30a978f06942cc62a86ab45e90fc2e5d763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aaa74c70ca8d93d672246f3451771bffa1e304c88a453cbbe3a305a304954fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0d624981ca616d27057de09511f8a17b643686871e2e77c15bd1c65fba9c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32a9edd2bc21611da46c1c1e6a5412feb00ae4b15218128d6ed5e68f906db2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9949e09b836433403be3f3ce9a5226b926ee28efa766add03a7d0f8cef39fb0b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"message\\\":\\\"1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 14:43:41.448023 6091 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1003 14:43:41.448068 6091 reflector.go:311] Stopping reflector 
*v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1003 14:43:41.448186 6091 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1003 14:43:41.448517 6091 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1003 14:43:41.448572 6091 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1003 14:43:41.448578 6091 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1003 14:43:41.448595 6091 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 14:43:41.448615 6091 factory.go:656] Stopping watch factory\\\\nI1003 14:43:41.448627 6091 ovnkube.go:599] Stopped ovnkube\\\\nI1003 14:43:41.448661 6091 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 14:43:41.448662 6091 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 14:43:41.448676 6091 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32a9edd2bc21611da46c1c1e6a5412feb00ae4b15218128d6ed5e68f906db2c1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"message\\\":\\\" []services.lbConfig(nil)\\\\nI1003 14:43:42.614795 6308 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1003 14:43:42.614886 6308 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1003 14:43:42.616482 6308 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1003 14:43:42.616487 6308 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1003 14:43:42.614714 6308 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1003 14:43:42.614799 6308 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\
"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac1b1bb4b30aa1d11558ff126ad87e4dd7b9aaf255cd4a1860aacdee30a0bb15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\
\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:43Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.758206 4774 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-ghf5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d3c89f-9fbd-4d50-840a-c5c78528c903\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ghf5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:43Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:43 crc 
kubenswrapper[4774]: I1003 14:43:43.768754 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:43Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.781161 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:43Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.791072 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:43Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.800433 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T14:43:43Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.809900 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:43Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.818791 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.818984 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.819075 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.819063 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d94e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4
317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:43Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.819154 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.819185 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:43Z","lastTransitionTime":"2025-10-03T14:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.847826 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88d3c89f-9fbd-4d50-840a-c5c78528c903-metrics-certs\") pod \"network-metrics-daemon-ghf5t\" (UID: \"88d3c89f-9fbd-4d50-840a-c5c78528c903\") " pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:43:43 crc kubenswrapper[4774]: E1003 14:43:43.847957 4774 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 14:43:43 crc kubenswrapper[4774]: E1003 14:43:43.848018 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88d3c89f-9fbd-4d50-840a-c5c78528c903-metrics-certs podName:88d3c89f-9fbd-4d50-840a-c5c78528c903 nodeName:}" failed. No retries permitted until 2025-10-03 14:43:47.848003648 +0000 UTC m=+50.437207100 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/88d3c89f-9fbd-4d50-840a-c5c78528c903-metrics-certs") pod "network-metrics-daemon-ghf5t" (UID: "88d3c89f-9fbd-4d50-840a-c5c78528c903") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.921695 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.921756 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.921766 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.921782 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:43 crc kubenswrapper[4774]: I1003 14:43:43.921793 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:43Z","lastTransitionTime":"2025-10-03T14:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.024585 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.024661 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.024684 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.024711 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.024732 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:44Z","lastTransitionTime":"2025-10-03T14:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.127717 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.127802 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.127825 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.127840 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.127850 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:44Z","lastTransitionTime":"2025-10-03T14:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.230829 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.230875 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.230886 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.230904 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.230917 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:44Z","lastTransitionTime":"2025-10-03T14:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.299090 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.299225 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.299365 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:43:44 crc kubenswrapper[4774]: E1003 14:43:44.299458 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:43:44 crc kubenswrapper[4774]: E1003 14:43:44.299545 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:43:44 crc kubenswrapper[4774]: E1003 14:43:44.299235 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.335031 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.335069 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.335077 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.335090 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.335100 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:44Z","lastTransitionTime":"2025-10-03T14:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.438168 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.438245 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.438270 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.438304 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.438329 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:44Z","lastTransitionTime":"2025-10-03T14:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.541748 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.541842 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.541862 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.541887 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.541903 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:44Z","lastTransitionTime":"2025-10-03T14:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.604899 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzv75_01bef0c3-23b3-4f49-8c33-3f2ec7503b12/ovnkube-controller/1.log" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.644200 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.644270 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.644294 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.644329 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.644352 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:44Z","lastTransitionTime":"2025-10-03T14:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.747137 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.747200 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.747217 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.747240 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.747256 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:44Z","lastTransitionTime":"2025-10-03T14:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.849691 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.849732 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.849745 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.849762 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.849774 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:44Z","lastTransitionTime":"2025-10-03T14:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.953005 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.953130 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.953156 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.953185 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:44 crc kubenswrapper[4774]: I1003 14:43:44.953207 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:44Z","lastTransitionTime":"2025-10-03T14:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.055419 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.055464 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.055473 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.055486 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.055497 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:45Z","lastTransitionTime":"2025-10-03T14:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.158259 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.158313 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.158325 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.158339 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.158351 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:45Z","lastTransitionTime":"2025-10-03T14:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.261726 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.261805 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.261820 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.261839 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.261851 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:45Z","lastTransitionTime":"2025-10-03T14:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.298763 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:43:45 crc kubenswrapper[4774]: E1003 14:43:45.298983 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.364680 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.364721 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.364732 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.364748 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.364759 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:45Z","lastTransitionTime":"2025-10-03T14:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.467169 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.467237 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.467253 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.467276 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.467295 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:45Z","lastTransitionTime":"2025-10-03T14:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.571264 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.571318 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.571332 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.571349 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.571362 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:45Z","lastTransitionTime":"2025-10-03T14:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.673728 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.673788 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.673805 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.673830 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.673850 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:45Z","lastTransitionTime":"2025-10-03T14:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.775889 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.776001 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.776020 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.776043 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.776060 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:45Z","lastTransitionTime":"2025-10-03T14:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.879347 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.879421 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.879444 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.879475 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.879493 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:45Z","lastTransitionTime":"2025-10-03T14:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.982263 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.982302 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.982315 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.982331 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:45 crc kubenswrapper[4774]: I1003 14:43:45.982343 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:45Z","lastTransitionTime":"2025-10-03T14:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.086340 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.086445 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.086473 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.086502 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.086523 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:46Z","lastTransitionTime":"2025-10-03T14:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.189073 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.189135 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.189144 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.189161 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.189171 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:46Z","lastTransitionTime":"2025-10-03T14:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.292035 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.292075 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.292086 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.292100 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.292109 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:46Z","lastTransitionTime":"2025-10-03T14:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.299352 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.299368 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.299398 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:43:46 crc kubenswrapper[4774]: E1003 14:43:46.299494 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:43:46 crc kubenswrapper[4774]: E1003 14:43:46.299703 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:43:46 crc kubenswrapper[4774]: E1003 14:43:46.299771 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.395048 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.395110 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.395128 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.395152 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.395172 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:46Z","lastTransitionTime":"2025-10-03T14:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.498161 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.498199 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.498208 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.498224 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.498235 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:46Z","lastTransitionTime":"2025-10-03T14:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.600448 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.600494 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.600505 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.600522 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.600534 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:46Z","lastTransitionTime":"2025-10-03T14:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.702808 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.702858 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.702867 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.702881 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.702890 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:46Z","lastTransitionTime":"2025-10-03T14:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.806003 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.806095 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.806119 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.806152 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.806177 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:46Z","lastTransitionTime":"2025-10-03T14:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.908495 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.908529 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.908539 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.908552 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:46 crc kubenswrapper[4774]: I1003 14:43:46.908561 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:46Z","lastTransitionTime":"2025-10-03T14:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.010922 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.010966 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.010978 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.010992 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.011004 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:47Z","lastTransitionTime":"2025-10-03T14:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.114278 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.114731 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.114743 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.114763 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.114774 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:47Z","lastTransitionTime":"2025-10-03T14:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.218036 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.218105 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.218128 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.218154 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.218199 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:47Z","lastTransitionTime":"2025-10-03T14:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.299004 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:43:47 crc kubenswrapper[4774]: E1003 14:43:47.299144 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.319973 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.320008 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.320017 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.320030 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.320040 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:47Z","lastTransitionTime":"2025-10-03T14:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.423027 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.423079 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.423101 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.423118 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.423129 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:47Z","lastTransitionTime":"2025-10-03T14:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.525433 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.525475 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.525488 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.525505 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.525517 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:47Z","lastTransitionTime":"2025-10-03T14:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.627676 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.627711 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.627721 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.627734 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.627743 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:47Z","lastTransitionTime":"2025-10-03T14:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.731273 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.731344 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.731366 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.731474 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.731547 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:47Z","lastTransitionTime":"2025-10-03T14:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.834188 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.834256 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.834277 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.834413 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.834434 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:47Z","lastTransitionTime":"2025-10-03T14:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.894758 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88d3c89f-9fbd-4d50-840a-c5c78528c903-metrics-certs\") pod \"network-metrics-daemon-ghf5t\" (UID: \"88d3c89f-9fbd-4d50-840a-c5c78528c903\") " pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:43:47 crc kubenswrapper[4774]: E1003 14:43:47.894926 4774 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 14:43:47 crc kubenswrapper[4774]: E1003 14:43:47.894981 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88d3c89f-9fbd-4d50-840a-c5c78528c903-metrics-certs podName:88d3c89f-9fbd-4d50-840a-c5c78528c903 nodeName:}" failed. No retries permitted until 2025-10-03 14:43:55.894966672 +0000 UTC m=+58.484170124 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/88d3c89f-9fbd-4d50-840a-c5c78528c903-metrics-certs") pod "network-metrics-daemon-ghf5t" (UID: "88d3c89f-9fbd-4d50-840a-c5c78528c903") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.936219 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.936466 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.936557 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.936674 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:47 crc kubenswrapper[4774]: I1003 14:43:47.936868 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:47Z","lastTransitionTime":"2025-10-03T14:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.038944 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.038985 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.038997 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.039045 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.039058 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:48Z","lastTransitionTime":"2025-10-03T14:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.141170 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.141237 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.141259 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.141284 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.141302 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:48Z","lastTransitionTime":"2025-10-03T14:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.243992 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.244023 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.244037 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.244052 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.244064 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:48Z","lastTransitionTime":"2025-10-03T14:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.298355 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.298428 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.298890 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:43:48 crc kubenswrapper[4774]: E1003 14:43:48.299010 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:43:48 crc kubenswrapper[4774]: E1003 14:43:48.299156 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:43:48 crc kubenswrapper[4774]: E1003 14:43:48.299580 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.347167 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.347209 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.347224 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.347243 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.347258 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:48Z","lastTransitionTime":"2025-10-03T14:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.450190 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.450250 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.450263 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.450284 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.450297 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:48Z","lastTransitionTime":"2025-10-03T14:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.552450 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.552495 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.552507 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.552525 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.552536 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:48Z","lastTransitionTime":"2025-10-03T14:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.654832 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.654895 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.654918 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.654947 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.654969 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:48Z","lastTransitionTime":"2025-10-03T14:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.758012 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.758059 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.758069 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.758083 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.758093 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:48Z","lastTransitionTime":"2025-10-03T14:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.860413 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.860473 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.860490 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.860514 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.860534 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:48Z","lastTransitionTime":"2025-10-03T14:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.962582 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.962616 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.962633 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.962649 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:48 crc kubenswrapper[4774]: I1003 14:43:48.962660 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:48Z","lastTransitionTime":"2025-10-03T14:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.065831 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.065883 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.065893 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.065908 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.065922 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:49Z","lastTransitionTime":"2025-10-03T14:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.169018 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.169094 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.169120 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.169151 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.169168 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:49Z","lastTransitionTime":"2025-10-03T14:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.272387 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.272434 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.272447 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.272466 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.272480 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:49Z","lastTransitionTime":"2025-10-03T14:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.299020 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:43:49 crc kubenswrapper[4774]: E1003 14:43:49.299206 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.318306 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.339512 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T14:43:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.354606 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.369242 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d94e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4
317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.374204 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.374241 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.374254 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:49 crc 
kubenswrapper[4774]: I1003 14:43:49.374270 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.374282 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:49Z","lastTransitionTime":"2025-10-03T14:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.383935 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.403171 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.422322 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.441921 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.454100 4774 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.468522 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee6
5f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveRe
adOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.476746 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.476797 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.476810 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.476825 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.476835 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:49Z","lastTransitionTime":"2025-10-03T14:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.487199 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16e8c0249ee2448082160abe417f67a17fac176add76d00dd5e755c0f9a6a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.500742 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.520101 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a01f2ab-7f7c-411c-b424-0d382dee6976\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b65b48ad13df81dd9ccb04fd8aeebbf976409d5d318f51a923c2dfc0cd8cc7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3803c97bfaf2ffb04a8a6924cb97ee022c7d3
4a38e1a50ce63a7b8f062fc901d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jlwtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.538883 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd06fa447f9a388f7cabafaff05c8d6dc9aedc161d2097f81855318c8b7712e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b330d308f14c0b3b3630e89fe3e43728e17dbc3fe78cf9cae365a3ab0474c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b575a360d2ee43bc2629635a4ffda77a1b0e1a171bbd99b76b725784be537e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae8d985dec947c332efa41abbc5c30a978f06942cc62a86ab45e90fc2e5d763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aaa74c70ca8d93d672246f3451771bffa1e304c88a453cbbe3a305a304954fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0d624981ca616d27057de09511f8a17b643686871e2e77c15bd1c65fba9c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32a9edd2bc21611da46c1c1e6a5412feb00ae4b15218128d6ed5e68f906db2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9949e09b836433403be3f3ce9a5226b926ee28efa766add03a7d0f8cef39fb0b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"message\\\":\\\"1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 14:43:41.448023 6091 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1003 14:43:41.448068 6091 reflector.go:311] Stopping reflector 
*v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1003 14:43:41.448186 6091 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1003 14:43:41.448517 6091 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1003 14:43:41.448572 6091 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1003 14:43:41.448578 6091 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1003 14:43:41.448595 6091 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 14:43:41.448615 6091 factory.go:656] Stopping watch factory\\\\nI1003 14:43:41.448627 6091 ovnkube.go:599] Stopped ovnkube\\\\nI1003 14:43:41.448661 6091 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 14:43:41.448662 6091 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 14:43:41.448676 6091 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32a9edd2bc21611da46c1c1e6a5412feb00ae4b15218128d6ed5e68f906db2c1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"message\\\":\\\" []services.lbConfig(nil)\\\\nI1003 14:43:42.614795 6308 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1003 14:43:42.614886 6308 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1003 14:43:42.616482 6308 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1003 14:43:42.616487 6308 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1003 14:43:42.614714 6308 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1003 14:43:42.614799 6308 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\
"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac1b1bb4b30aa1d11558ff126ad87e4dd7b9aaf255cd4a1860aacdee30a0bb15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\
\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.555027 4774 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-ghf5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d3c89f-9fbd-4d50-840a-c5c78528c903\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ghf5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:49 crc 
kubenswrapper[4774]: I1003 14:43:49.572148 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.575510 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.580080 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.580149 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.580172 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.580201 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:49 crc 
kubenswrapper[4774]: I1003 14:43:49.580220 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:49Z","lastTransitionTime":"2025-10-03T14:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.590644 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://686819b674279f59619905ac4ba451ca79c97fc652a773307eca3fad2352ddda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.604926 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.620296 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.634100 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d94e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4
317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.652565 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.664628 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.679174 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.684110 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.684172 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.684191 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:49 crc 
kubenswrapper[4774]: I1003 14:43:49.684218 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.684239 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:49Z","lastTransitionTime":"2025-10-03T14:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.693932 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 
14:43:49.707323 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.725485 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.743066 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16e8c0249ee2448082160abe417f67a17fac176add76d00dd5e755c0f9a6a00\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\
\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.754105 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.773896 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.786643 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a01f2ab-7f7c-411c-b424-0d382dee6976\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b65b48ad13df81dd9ccb04fd8aeebbf976409d5d318f51a923c2dfc0cd8cc7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3803c97bfaf2ffb04a8a6924cb97ee022c7d3
4a38e1a50ce63a7b8f062fc901d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jlwtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.787694 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.787806 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.787889 4774 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.787980 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.788062 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:49Z","lastTransitionTime":"2025-10-03T14:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.797883 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ghf5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d3c89f-9fbd-4d50-840a-c5c78528c903\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ghf5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:49Z is after 2025-08-24T17:21:41Z" Oct 
03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.810485 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576068284d101e0f67c69e0fa70cd319c0c8adf6dc6e
f5b4ca740ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed082
87faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.826674 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://686819b674279f59619905ac4ba451ca79c97fc652a773307eca3fad2352ddda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6
a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.854903 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd06fa447f9a388f7cabafaff05c8d6dc9aedc161d2097f81855318c8b7712e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b330d308f14c0b3b3630e89fe3e43728e17dbc3fe78cf9cae365a3ab0474c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b575a360d2ee43bc2629635a4ffda77a1b0e1a171bbd99b76b725784be537e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae8d985dec947c332efa41abbc5c30a978f06942cc62a86ab45e90fc2e5d763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aaa74c70ca8d93d672246f3451771bffa1e304c88a453cbbe3a305a304954fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0d624981ca616d27057de09511f8a17b643686871e2e77c15bd1c65fba9c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32a9edd2bc21611da46c1c1e6a5412feb00ae4b15218128d6ed5e68f906db2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9949e09b836433403be3f3ce9a5226b926ee28efa766add03a7d0f8cef39fb0b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"message\\\":\\\"1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 14:43:41.448023 6091 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1003 14:43:41.448068 6091 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1003 14:43:41.448186 6091 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1003 14:43:41.448517 6091 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1003 14:43:41.448572 6091 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1003 14:43:41.448578 6091 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1003 14:43:41.448595 6091 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 14:43:41.448615 6091 factory.go:656] Stopping watch factory\\\\nI1003 14:43:41.448627 6091 ovnkube.go:599] Stopped ovnkube\\\\nI1003 14:43:41.448661 6091 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 14:43:41.448662 6091 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 14:43:41.448676 6091 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32a9edd2bc21611da46c1c1e6a5412feb00ae4b15218128d6ed5e68f906db2c1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"message\\\":\\\" []services.lbConfig(nil)\\\\nI1003 14:43:42.614795 6308 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1003 14:43:42.614886 6308 ovn.go:134] Ensuring zone local 
for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1003 14:43:42.616482 6308 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1003 14:43:42.616487 6308 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1003 14:43:42.614714 6308 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1003 14:43:42.614799 6308 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": 
failed\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac1b1bb4b30aa1d11558ff126ad87e4dd7b9aaf255cd4a1860aacdee30a0bb15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583
a5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.891192 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.891251 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.891270 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.891294 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.891312 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:49Z","lastTransitionTime":"2025-10-03T14:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.993838 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.993878 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.993887 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.993902 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:49 crc kubenswrapper[4774]: I1003 14:43:49.993912 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:49Z","lastTransitionTime":"2025-10-03T14:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.097222 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.097282 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.097292 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.097314 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.097325 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:50Z","lastTransitionTime":"2025-10-03T14:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.200757 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.200814 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.200827 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.200851 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.200868 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:50Z","lastTransitionTime":"2025-10-03T14:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.298882 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.298978 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:43:50 crc kubenswrapper[4774]: E1003 14:43:50.299038 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.299055 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:43:50 crc kubenswrapper[4774]: E1003 14:43:50.299202 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:43:50 crc kubenswrapper[4774]: E1003 14:43:50.299399 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.304176 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.304217 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.304226 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.304245 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.304259 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:50Z","lastTransitionTime":"2025-10-03T14:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.407060 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.407129 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.407149 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.407178 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.407202 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:50Z","lastTransitionTime":"2025-10-03T14:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.509998 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.510073 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.510099 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.510130 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.510147 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:50Z","lastTransitionTime":"2025-10-03T14:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.612656 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.612690 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.612700 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.612714 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.612725 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:50Z","lastTransitionTime":"2025-10-03T14:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.715987 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.716033 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.716044 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.716064 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.716076 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:50Z","lastTransitionTime":"2025-10-03T14:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.819036 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.819077 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.819087 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.819102 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.819111 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:50Z","lastTransitionTime":"2025-10-03T14:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.922726 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.923086 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.923297 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.923510 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:50 crc kubenswrapper[4774]: I1003 14:43:50.923697 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:50Z","lastTransitionTime":"2025-10-03T14:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.026535 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.026590 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.026608 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.026652 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.026673 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:51Z","lastTransitionTime":"2025-10-03T14:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.129429 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.129472 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.129483 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.129499 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.129511 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:51Z","lastTransitionTime":"2025-10-03T14:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.233491 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.233575 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.233604 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.233637 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.233665 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:51Z","lastTransitionTime":"2025-10-03T14:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.299190 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:43:51 crc kubenswrapper[4774]: E1003 14:43:51.299426 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.337033 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.337460 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.337569 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.337687 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.337797 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:51Z","lastTransitionTime":"2025-10-03T14:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.441049 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.441089 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.441100 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.441117 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.441128 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:51Z","lastTransitionTime":"2025-10-03T14:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.544081 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.544365 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.544477 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.544570 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.544656 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:51Z","lastTransitionTime":"2025-10-03T14:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.646672 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.646719 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.646730 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.646747 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.646757 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:51Z","lastTransitionTime":"2025-10-03T14:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.750162 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.750326 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.750348 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.750406 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.750452 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:51Z","lastTransitionTime":"2025-10-03T14:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.854058 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.854146 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.854158 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.854178 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.854193 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:51Z","lastTransitionTime":"2025-10-03T14:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.956783 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.956832 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.956844 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.956859 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:51 crc kubenswrapper[4774]: I1003 14:43:51.956870 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:51Z","lastTransitionTime":"2025-10-03T14:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.059897 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.059932 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.059944 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.059960 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.059973 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:52Z","lastTransitionTime":"2025-10-03T14:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.162622 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.162664 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.162676 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.162692 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.162704 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:52Z","lastTransitionTime":"2025-10-03T14:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.265350 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.265461 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.265484 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.265511 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.265528 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:52Z","lastTransitionTime":"2025-10-03T14:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.298702 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.298772 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:43:52 crc kubenswrapper[4774]: E1003 14:43:52.298861 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:43:52 crc kubenswrapper[4774]: E1003 14:43:52.298995 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.298721 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:43:52 crc kubenswrapper[4774]: E1003 14:43:52.299082 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.368456 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.368528 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.368552 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.368581 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.368604 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:52Z","lastTransitionTime":"2025-10-03T14:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.471442 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.471499 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.471517 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.471545 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.471568 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:52Z","lastTransitionTime":"2025-10-03T14:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.574879 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.574931 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.574943 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.574960 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.574973 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:52Z","lastTransitionTime":"2025-10-03T14:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.677581 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.677640 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.677655 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.677674 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.677692 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:52Z","lastTransitionTime":"2025-10-03T14:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.713889 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.714825 4774 scope.go:117] "RemoveContainer" containerID="32a9edd2bc21611da46c1c1e6a5412feb00ae4b15218128d6ed5e68f906db2c1" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.740173 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:52Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.757492 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:52Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.772489 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:52Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.779575 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.779606 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.779618 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.779633 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.779643 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:52Z","lastTransitionTime":"2025-10-03T14:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.789064 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:52Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.805604 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:52Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.820573 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d94e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4
317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:52Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.846529 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:52Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.864708 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:52Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.876220 4774 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:52Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.881953 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.882028 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.882054 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.882109 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.882135 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:52Z","lastTransitionTime":"2025-10-03T14:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.898107 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:52Z 
is after 2025-08-24T17:21:41Z" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.920047 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16e8c0249ee2448082160abe417f67a17fac176add76d00dd5e755c0f9a6a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:52Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.936171 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:355
12335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:52Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.948945 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a01f2ab-7f7c-411c-b424-0d382dee6976\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b65b48ad13df81dd9ccb04fd8aeebbf976409d5d318f51a923c2dfc0cd8cc7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3803c97bfaf2ffb04a8a6924cb97ee022c7d3
4a38e1a50ce63a7b8f062fc901d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jlwtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:52Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.966943 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:52Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.985302 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.985361 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.985416 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.985447 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.985469 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:52Z","lastTransitionTime":"2025-10-03T14:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:52 crc kubenswrapper[4774]: I1003 14:43:52.991859 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountP
ath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://686819b674279f59619905ac4ba451ca79c97fc652a773307eca3fad2352ddda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube
-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 
14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:52Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.023813 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd06fa447f9a388f7cabafaff05c8d6dc9aedc161d2097f81855318c8b7712e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b330d308f14c0b3b3630e89fe3e43728e17dbc3fe78cf9cae365a3ab0474c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b575a360d2ee43bc2629635a4ffda77a1b0e1a171bbd99b76b725784be537e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae8d985dec947c332efa41abbc5c30a978f06942cc62a86ab45e90fc2e5d763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aaa74c70ca8d93d672246f3451771bffa1e304c88a453cbbe3a305a304954fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0d624981ca616d27057de09511f8a17b643686871e2e77c15bd1c65fba9c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32a9edd2bc21611da46c1c1e6a5412feb00ae4b15218128d6ed5e68f906db2c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32a9edd2bc21611da46c1c1e6a5412feb00ae4b15218128d6ed5e68f906db2c1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"message\\\":\\\" []services.lbConfig(nil)\\\\nI1003 14:43:42.614795 6308 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1003 14:43:42.614886 6308 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1003 14:43:42.616482 6308 
obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1003 14:43:42.616487 6308 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1003 14:43:42.614714 6308 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1003 14:43:42.614799 6308 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jzv75_openshift-ovn-kubernetes(01bef0c3-23b3-4f49-8c33-3f2ec7503b12)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac1b1bb4b30aa1d11558ff126ad87e4dd7b9aaf255cd4a1860aacdee30a0bb15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948
874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:53Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.041148 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ghf5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d3c89f-9fbd-4d50-840a-c5c78528c903\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ghf5t\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:53Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.043494 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.043551 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.043570 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.043595 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.043612 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:53Z","lastTransitionTime":"2025-10-03T14:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:53 crc kubenswrapper[4774]: E1003 14:43:53.064920 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:53Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.070353 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.070401 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.070415 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.070431 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.070443 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:53Z","lastTransitionTime":"2025-10-03T14:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:53 crc kubenswrapper[4774]: E1003 14:43:53.084344 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:53Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.089307 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.089336 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.089346 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.089361 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.089389 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:53Z","lastTransitionTime":"2025-10-03T14:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:53 crc kubenswrapper[4774]: E1003 14:43:53.106111 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:53Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.110015 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.110042 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.110053 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.110067 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.110078 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:53Z","lastTransitionTime":"2025-10-03T14:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:53 crc kubenswrapper[4774]: E1003 14:43:53.122236 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:53Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.127569 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.127601 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.127617 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.127639 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.127655 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:53Z","lastTransitionTime":"2025-10-03T14:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:53 crc kubenswrapper[4774]: E1003 14:43:53.143798 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:53Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:53 crc kubenswrapper[4774]: E1003 14:43:53.144000 4774 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.145574 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.145601 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.145611 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.145627 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.145638 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:53Z","lastTransitionTime":"2025-10-03T14:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.248064 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.248096 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.248107 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.248122 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.248134 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:53Z","lastTransitionTime":"2025-10-03T14:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.302309 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:43:53 crc kubenswrapper[4774]: E1003 14:43:53.302479 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.350985 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.351023 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.351037 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.351057 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.351072 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:53Z","lastTransitionTime":"2025-10-03T14:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.453165 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.453202 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.453212 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.453227 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.453238 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:53Z","lastTransitionTime":"2025-10-03T14:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.555352 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.555490 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.555510 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.555537 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.555557 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:53Z","lastTransitionTime":"2025-10-03T14:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.642089 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzv75_01bef0c3-23b3-4f49-8c33-3f2ec7503b12/ovnkube-controller/1.log" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.644948 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" event={"ID":"01bef0c3-23b3-4f49-8c33-3f2ec7503b12","Type":"ContainerStarted","Data":"d8a4afd85c0d75edd3184845b7cd816499325c7f7ecef0f46b961419dfe6dd86"} Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.645424 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.657612 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.657656 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.657668 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.657687 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.657699 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:53Z","lastTransitionTime":"2025-10-03T14:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.660913 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a01f2ab-7f7c-411c-b424-0d382dee6976\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b65b48ad13df81dd9ccb04fd8aeebbf976409d5d318f51a923c2dfc0cd8cc7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3803c97bfaf2ffb04a8a6924cb97ee022c7d34a38e1a50ce63a7b8f062fc901d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jlwtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:53Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.681546 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f781
4a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://686819b674279f59619905ac4ba451ca79c97fc652a773307eca3fad2352ddda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T
14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6
db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:53Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.700855 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd06fa447f9a388f7cabafaff05c8d6dc9aedc161d2097f81855318c8b7712e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b330d308f14c0b3b3630e89fe3e43728e17dbc3fe78cf9cae365a3ab0474c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b575a360d2ee43bc2629635a4ffda77a1b0e1a171bbd99b76b725784be537e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae8d985dec947c332efa41abbc5c30a978f06942cc62a86ab45e90fc2e5d763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aaa74c70ca8d93d672246f3451771bffa1e304c88a453cbbe3a305a304954fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0d624981ca616d27057de09511f8a17b643686871e2e77c15bd1c65fba9c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8a4afd85c0d75edd3184845b7cd816499325c7f7ecef0f46b961419dfe6dd86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32a9edd2bc21611da46c1c1e6a5412feb00ae4b15218128d6ed5e68f906db2c1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"message\\\":\\\" []services.lbConfig(nil)\\\\nI1003 14:43:42.614795 6308 obj_retry.go:303] Retry object 
setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1003 14:43:42.614886 6308 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1003 14:43:42.616482 6308 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1003 14:43:42.616487 6308 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1003 14:43:42.614714 6308 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1003 14:43:42.614799 6308 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": 
failed\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac1b1bb4b30aa1d11558ff126ad87e4dd7b9aaf255cd4a1860aacdee30a0bb15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:53Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.717238 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ghf5t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d3c89f-9fbd-4d50-840a-c5c78528c903\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ghf5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:53Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:53 crc 
kubenswrapper[4774]: I1003 14:43:53.740711 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:53Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.759630 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:53Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.761075 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.761131 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:53 crc 
kubenswrapper[4774]: I1003 14:43:53.761147 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.761169 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.761184 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:53Z","lastTransitionTime":"2025-10-03T14:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.773495 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:53Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.784449 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T14:43:53Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.794396 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:53Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.808261 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d94e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4
317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:53Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.821572 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:53Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.843057 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
25-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:53Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.856346 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:53Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.863715 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.863753 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.863772 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.863793 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.863808 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:53Z","lastTransitionTime":"2025-10-03T14:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.868147 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:53Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.883514 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:53Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.910585 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16e8c0249ee2448082160abe417f67a17fac176add76d00dd5e755c0f9a6a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378
529aecd2971532c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:
43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:33Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:53Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.925280 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{
\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:53Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.960681 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:43:53 crc kubenswrapper[4774]: E1003 14:43:53.960837 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:44:25.96081998 +0000 UTC m=+88.550023432 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.966310 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.966348 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.966358 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.966407 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:53 crc kubenswrapper[4774]: I1003 14:43:53.966420 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:53Z","lastTransitionTime":"2025-10-03T14:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.062640 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.062783 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.062854 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.062906 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:43:54 crc kubenswrapper[4774]: E1003 14:43:54.062937 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Oct 03 14:43:54 crc kubenswrapper[4774]: E1003 14:43:54.062982 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 14:43:54 crc kubenswrapper[4774]: E1003 14:43:54.063004 4774 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:43:54 crc kubenswrapper[4774]: E1003 14:43:54.063073 4774 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 14:43:54 crc kubenswrapper[4774]: E1003 14:43:54.063071 4774 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 14:43:54 crc kubenswrapper[4774]: E1003 14:43:54.063087 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 14:44:26.06306068 +0000 UTC m=+88.652264172 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:43:54 crc kubenswrapper[4774]: E1003 14:43:54.063222 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 14:44:26.063195823 +0000 UTC m=+88.652399315 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 14:43:54 crc kubenswrapper[4774]: E1003 14:43:54.063263 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 14:44:26.063249295 +0000 UTC m=+88.652452787 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 14:43:54 crc kubenswrapper[4774]: E1003 14:43:54.063297 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 14:43:54 crc kubenswrapper[4774]: E1003 14:43:54.063336 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 14:43:54 crc kubenswrapper[4774]: E1003 14:43:54.063364 4774 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:43:54 crc kubenswrapper[4774]: E1003 14:43:54.063488 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 14:44:26.0634661 +0000 UTC m=+88.652669582 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.069135 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.069226 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.069248 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.069275 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.069293 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:54Z","lastTransitionTime":"2025-10-03T14:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.173094 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.173123 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.173133 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.173147 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.173157 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:54Z","lastTransitionTime":"2025-10-03T14:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.276353 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.276402 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.276413 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.276429 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.276440 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:54Z","lastTransitionTime":"2025-10-03T14:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.298957 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.299000 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.299041 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:43:54 crc kubenswrapper[4774]: E1003 14:43:54.299087 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:43:54 crc kubenswrapper[4774]: E1003 14:43:54.299221 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:43:54 crc kubenswrapper[4774]: E1003 14:43:54.299399 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.378772 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.378815 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.378826 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.378844 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.378856 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:54Z","lastTransitionTime":"2025-10-03T14:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.481785 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.481846 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.481868 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.481894 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.481913 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:54Z","lastTransitionTime":"2025-10-03T14:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.584298 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.584353 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.584366 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.584395 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.584407 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:54Z","lastTransitionTime":"2025-10-03T14:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.650438 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzv75_01bef0c3-23b3-4f49-8c33-3f2ec7503b12/ovnkube-controller/2.log" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.651167 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzv75_01bef0c3-23b3-4f49-8c33-3f2ec7503b12/ovnkube-controller/1.log" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.653636 4774 generic.go:334] "Generic (PLEG): container finished" podID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerID="d8a4afd85c0d75edd3184845b7cd816499325c7f7ecef0f46b961419dfe6dd86" exitCode=1 Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.653667 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" event={"ID":"01bef0c3-23b3-4f49-8c33-3f2ec7503b12","Type":"ContainerDied","Data":"d8a4afd85c0d75edd3184845b7cd816499325c7f7ecef0f46b961419dfe6dd86"} Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.653707 4774 scope.go:117] "RemoveContainer" containerID="32a9edd2bc21611da46c1c1e6a5412feb00ae4b15218128d6ed5e68f906db2c1" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.654557 4774 scope.go:117] "RemoveContainer" containerID="d8a4afd85c0d75edd3184845b7cd816499325c7f7ecef0f46b961419dfe6dd86" Oct 03 14:43:54 crc kubenswrapper[4774]: E1003 14:43:54.654756 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-jzv75_openshift-ovn-kubernetes(01bef0c3-23b3-4f49-8c33-3f2ec7503b12)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.674612 4774 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:54Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.686790 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.686830 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.686843 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.686865 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.686877 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:54Z","lastTransitionTime":"2025-10-03T14:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.687536 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:54Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.698688 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:54Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.709897 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d94e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4
317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:54Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.723669 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:54Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.743848 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:54Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.773329 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:54Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.787685 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:54Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.788885 4774 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.788922 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.788934 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.788955 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.788978 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:54Z","lastTransitionTime":"2025-10-03T14:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.799584 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:54Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.812527 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd
5d8d10acbde2de9b1163b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:54Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.826293 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16e8c0249ee2448082160abe41
7f67a17fac176add76d00dd5e755c0f9a6a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{
\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7f
a93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:54Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.837981 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:54Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.848137 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a01f2ab-7f7c-411c-b424-0d382dee6976\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b65b48ad13df81dd9ccb04fd8aeebbf976409d5d318f51a923c2dfc0cd8cc7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3803c97bfaf2ffb04a8a6924cb97ee022c7d34a38e1a50ce63a7b8f062fc901d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jlwtx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:54Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.870210 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd06fa447f9a388f7cabafaff05c8d6dc9aedc161d2097f81855318c8b7712e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b330d308f14c0b3b3630e89fe3e43728e17dbc3fe78cf9cae365a3ab0474c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b575a360d2ee43bc2629635a4ffda77a1b0e1a171bbd99b76b725784be537e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae8d985dec947c332efa41abbc5c30a978f06942cc62a86ab45e90fc2e5d763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aaa74c70ca8d93d672246f3451771bffa1e304c88a453cbbe3a305a304954fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0d624981ca616d27057de09511f8a17b643686871e2e77c15bd1c65fba9c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8a4afd85c0d75edd3184845b7cd816499325c7f7ecef0f46b961419dfe6dd86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32a9edd2bc21611da46c1c1e6a5412feb00ae4b15218128d6ed5e68f906db2c1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:43:42Z\\\",\\\"message\\\":\\\" []services.lbConfig(nil)\\\\nI1003 14:43:42.614795 6308 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1003 14:43:42.614886 6308 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1003 14:43:42.616482 6308 
obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1003 14:43:42.616487 6308 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1003 14:43:42.614714 6308 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1003 14:43:42.614799 6308 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8a4afd85c0d75edd3184845b7cd816499325c7f7ecef0f46b961419dfe6dd86\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 14:43:53.739938 6460 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1003 14:43:53.739975 6460 
address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1003 14:43:53.739997 6460 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1003 14:43:53.740056 6460 factory.go:1336] Added *v1.Node event handler 7\\\\nI1003 14:43:53.740098 6460 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1003 14:43:53.740599 6460 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1003 14:43:53.740805 6460 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1003 14:43:53.740883 6460 ovnkube.go:599] Stopped ovnkube\\\\nI1003 14:43:53.740916 6460 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 14:43:53.741090 6460 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\
\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac1b1bb4b30aa1d11558ff126ad87e4dd7b9aaf255cd4a1860aacdee30a0bb15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:54Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.887731 4774 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-multus/network-metrics-daemon-ghf5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d3c89f-9fbd-4d50-840a-c5c78528c903\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ghf5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:54Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:54 crc 
kubenswrapper[4774]: I1003 14:43:54.893595 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.893643 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.893659 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.893681 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.893702 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:54Z","lastTransitionTime":"2025-10-03T14:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.903568 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576
068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:54Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.918458 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://686819b674279f59619905ac4ba451ca79c97fc652a773307eca3fad2352ddda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6
a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:54Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.996199 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.996254 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.996263 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.996280 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:54 crc kubenswrapper[4774]: I1003 14:43:54.996289 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:54Z","lastTransitionTime":"2025-10-03T14:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.098641 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.098673 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.098681 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.098694 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.098702 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:55Z","lastTransitionTime":"2025-10-03T14:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.200751 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.201095 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.201114 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.201139 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.201161 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:55Z","lastTransitionTime":"2025-10-03T14:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.299230 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:43:55 crc kubenswrapper[4774]: E1003 14:43:55.299508 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.304654 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.304710 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.304736 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.304774 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.304800 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:55Z","lastTransitionTime":"2025-10-03T14:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.407517 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.407591 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.407614 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.407642 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.407660 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:55Z","lastTransitionTime":"2025-10-03T14:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.509399 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.509447 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.509461 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.509481 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.509497 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:55Z","lastTransitionTime":"2025-10-03T14:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.612093 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.612128 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.612136 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.612149 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.612159 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:55Z","lastTransitionTime":"2025-10-03T14:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.661207 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzv75_01bef0c3-23b3-4f49-8c33-3f2ec7503b12/ovnkube-controller/2.log" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.667984 4774 scope.go:117] "RemoveContainer" containerID="d8a4afd85c0d75edd3184845b7cd816499325c7f7ecef0f46b961419dfe6dd86" Oct 03 14:43:55 crc kubenswrapper[4774]: E1003 14:43:55.668339 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-jzv75_openshift-ovn-kubernetes(01bef0c3-23b3-4f49-8c33-3f2ec7503b12)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.689404 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:55Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.709622 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:55Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.718213 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.718260 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.718272 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.718289 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.718301 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:55Z","lastTransitionTime":"2025-10-03T14:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.724895 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:55Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.737272 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T14:43:55Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.751022 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:55Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.764253 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d94e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4
317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:55Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.785936 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16e8c0249ee2448082160abe417f67a17fac176add76d00dd5e755c0f9a6a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc0
c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:33Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:55Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.799255 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:55Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.820519 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.820552 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.820561 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.820573 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.820581 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:55Z","lastTransitionTime":"2025-10-03T14:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.825518 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:55Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.842100 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:55Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.852747 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:55Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.868562 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:55Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.885528 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a01f2ab-7f7c-411c-b424-0d382dee6976\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b65b48ad13df81dd9ccb04fd8aeebbf976409d5d318f51a923c2dfc0cd8cc7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3803c97bfaf2ffb04a8a6924cb97ee022c7d34a38e1a50ce63a7b8f062fc901d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jlwtx\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:55Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.901764 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:55Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.920244 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://686819b674279f59619905ac4ba451ca79c97fc652a773307eca3fad2352ddda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6
a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:55Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.923031 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.923096 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.923111 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.923129 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.923143 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:55Z","lastTransitionTime":"2025-10-03T14:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.949485 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd06fa447f9a388f7cabafaff05c8d6dc9aedc161d2097f81855318c8b7712e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b330d308f14c0b3b3630e89fe3e43728e17dbc3fe78cf9cae365a3ab0474c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b575a360d2ee43bc2629635a4ffda77a1b0e1a171bbd99b76b725784be537e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae8d985dec947c332efa41abbc5c30a978f06942cc62a86ab45e90fc2e5d763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aaa74c70ca8d93d672246f3451771bffa1e304c88a453cbbe3a305a304954fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0d624981ca616d27057de09511f8a17b643686871e2e77c15bd1c65fba9c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8a4afd85c0d75edd3184845b7cd816499325c7f7ecef0f46b961419dfe6dd86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8a4afd85c0d75edd3184845b7cd816499325c7f7ecef0f46b961419dfe6dd86\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 
14:43:53.739938 6460 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1003 14:43:53.739975 6460 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1003 14:43:53.739997 6460 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1003 14:43:53.740056 6460 factory.go:1336] Added *v1.Node event handler 7\\\\nI1003 14:43:53.740098 6460 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1003 14:43:53.740599 6460 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1003 14:43:53.740805 6460 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1003 14:43:53.740883 6460 ovnkube.go:599] Stopped ovnkube\\\\nI1003 14:43:53.740916 6460 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 14:43:53.741090 6460 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jzv75_openshift-ovn-kubernetes(01bef0c3-23b3-4f49-8c33-3f2ec7503b12)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac1b1bb4b30aa1d11558ff126ad87e4dd7b9aaf255cd4a1860aacdee30a0bb15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948
874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:55Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.961820 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ghf5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d3c89f-9fbd-4d50-840a-c5c78528c903\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ghf5t\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:55Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:55 crc kubenswrapper[4774]: I1003 14:43:55.982576 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88d3c89f-9fbd-4d50-840a-c5c78528c903-metrics-certs\") pod \"network-metrics-daemon-ghf5t\" (UID: \"88d3c89f-9fbd-4d50-840a-c5c78528c903\") " pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:43:55 crc kubenswrapper[4774]: E1003 14:43:55.982741 4774 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 14:43:55 crc kubenswrapper[4774]: E1003 14:43:55.982952 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88d3c89f-9fbd-4d50-840a-c5c78528c903-metrics-certs podName:88d3c89f-9fbd-4d50-840a-c5c78528c903 nodeName:}" failed. No retries permitted until 2025-10-03 14:44:11.982927603 +0000 UTC m=+74.572131095 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/88d3c89f-9fbd-4d50-840a-c5c78528c903-metrics-certs") pod "network-metrics-daemon-ghf5t" (UID: "88d3c89f-9fbd-4d50-840a-c5c78528c903") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.025833 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.025881 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.025894 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.025910 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.025919 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:56Z","lastTransitionTime":"2025-10-03T14:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.128211 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.128248 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.128258 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.128272 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.128282 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:56Z","lastTransitionTime":"2025-10-03T14:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.231030 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.231086 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.231107 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.231129 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.231145 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:56Z","lastTransitionTime":"2025-10-03T14:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.299229 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.299268 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.299304 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:43:56 crc kubenswrapper[4774]: E1003 14:43:56.299359 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:43:56 crc kubenswrapper[4774]: E1003 14:43:56.299534 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:43:56 crc kubenswrapper[4774]: E1003 14:43:56.299697 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.334362 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.334420 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.334432 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.334449 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.334463 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:56Z","lastTransitionTime":"2025-10-03T14:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.437156 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.437212 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.437229 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.437250 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.437268 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:56Z","lastTransitionTime":"2025-10-03T14:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.542210 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.542256 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.542269 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.542287 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.542300 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:56Z","lastTransitionTime":"2025-10-03T14:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.644855 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.644892 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.644901 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.644916 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.644927 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:56Z","lastTransitionTime":"2025-10-03T14:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.747702 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.747752 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.747765 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.747784 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.747797 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:56Z","lastTransitionTime":"2025-10-03T14:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.823655 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.841441 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:56Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.841716 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.850691 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.850725 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.850738 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.850754 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.850766 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:56Z","lastTransitionTime":"2025-10-03T14:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.860324 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:
43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:56Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.874631 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16e8c0249ee2448082160abe417f67a17fac176add76d00dd5e755c0f9a6a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additiona
l-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8
c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/op
t/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:56Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.884546 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:56Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.902153 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:56Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.915823 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:56Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.928859 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a01f2ab-7f7c-411c-b424-0d382dee6976\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b65b48ad13df81dd9ccb04fd8aeebbf976409d5d318f51a923c2dfc0cd8cc7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3803c97bfaf2ffb04a8a6924cb97ee022c7d3
4a38e1a50ce63a7b8f062fc901d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jlwtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:56Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.941347 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:56Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.953279 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.953316 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.953325 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.953338 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.953346 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:56Z","lastTransitionTime":"2025-10-03T14:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.954320 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountP
ath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://686819b674279f59619905ac4ba451ca79c97fc652a773307eca3fad2352ddda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube
-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 
14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:56Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.970182 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd06fa447f9a388f7cabafaff05c8d6dc9aedc161d2097f81855318c8b7712e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b330d308f14c0b3b3630e89fe3e43728e17dbc3fe78cf9cae365a3ab0474c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b575a360d2ee43bc2629635a4ffda77a1b0e1a171bbd99b76b725784be537e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae8d985dec947c332efa41abbc5c30a978f06942cc62a86ab45e90fc2e5d763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aaa74c70ca8d93d672246f3451771bffa1e304c88a453cbbe3a305a304954fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0d624981ca616d27057de09511f8a17b643686871e2e77c15bd1c65fba9c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8a4afd85c0d75edd3184845b7cd816499325c7f7ecef0f46b961419dfe6dd86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8a4afd85c0d75edd3184845b7cd816499325c7f7ecef0f46b961419dfe6dd86\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 
14:43:53.739938 6460 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1003 14:43:53.739975 6460 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1003 14:43:53.739997 6460 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1003 14:43:53.740056 6460 factory.go:1336] Added *v1.Node event handler 7\\\\nI1003 14:43:53.740098 6460 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1003 14:43:53.740599 6460 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1003 14:43:53.740805 6460 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1003 14:43:53.740883 6460 ovnkube.go:599] Stopped ovnkube\\\\nI1003 14:43:53.740916 6460 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 14:43:53.741090 6460 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jzv75_openshift-ovn-kubernetes(01bef0c3-23b3-4f49-8c33-3f2ec7503b12)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac1b1bb4b30aa1d11558ff126ad87e4dd7b9aaf255cd4a1860aacdee30a0bb15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948
874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:56Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.979705 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ghf5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d3c89f-9fbd-4d50-840a-c5c78528c903\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ghf5t\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:56Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:56 crc kubenswrapper[4774]: I1003 14:43:56.997849 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:56Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.009399 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d94e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4
317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:57Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.024446 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:57Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.041133 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:57Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.054903 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:57Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.056016 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.056064 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.056078 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:57 crc 
kubenswrapper[4774]: I1003 14:43:57.056096 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.056110 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:57Z","lastTransitionTime":"2025-10-03T14:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.067633 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:57Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.158715 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.158772 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.158790 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.158815 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.158834 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:57Z","lastTransitionTime":"2025-10-03T14:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.261987 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.262033 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.262046 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.262105 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.262121 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:57Z","lastTransitionTime":"2025-10-03T14:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.299110 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:43:57 crc kubenswrapper[4774]: E1003 14:43:57.299454 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.364510 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.364548 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.364558 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.364572 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.364581 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:57Z","lastTransitionTime":"2025-10-03T14:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.467120 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.467164 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.467174 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.467187 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.467195 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:57Z","lastTransitionTime":"2025-10-03T14:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.569946 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.569993 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.570010 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.570025 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.570035 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:57Z","lastTransitionTime":"2025-10-03T14:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.671696 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.671743 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.671754 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.671798 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.671810 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:57Z","lastTransitionTime":"2025-10-03T14:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.774497 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.774530 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.774538 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.774552 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.774560 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:57Z","lastTransitionTime":"2025-10-03T14:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.878118 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.878202 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.878225 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.878296 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.878318 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:57Z","lastTransitionTime":"2025-10-03T14:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.981210 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.981276 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.981294 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.981317 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:57 crc kubenswrapper[4774]: I1003 14:43:57.981335 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:57Z","lastTransitionTime":"2025-10-03T14:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.083620 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.083682 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.083695 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.083712 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.083723 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:58Z","lastTransitionTime":"2025-10-03T14:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.187070 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.187139 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.187153 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.187180 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.187196 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:58Z","lastTransitionTime":"2025-10-03T14:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.289861 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.289988 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.290066 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.290099 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.290119 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:58Z","lastTransitionTime":"2025-10-03T14:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.298671 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.298727 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.298697 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:43:58 crc kubenswrapper[4774]: E1003 14:43:58.298934 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:43:58 crc kubenswrapper[4774]: E1003 14:43:58.299043 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:43:58 crc kubenswrapper[4774]: E1003 14:43:58.299178 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.392643 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.392726 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.392737 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.392751 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.392761 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:58Z","lastTransitionTime":"2025-10-03T14:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.494957 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.495003 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.495017 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.495034 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.495046 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:58Z","lastTransitionTime":"2025-10-03T14:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.597852 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.597887 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.597896 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.597909 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.597917 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:58Z","lastTransitionTime":"2025-10-03T14:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.700522 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.700564 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.700574 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.700588 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.700598 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:58Z","lastTransitionTime":"2025-10-03T14:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.804178 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.804249 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.804273 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.804302 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.804339 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:58Z","lastTransitionTime":"2025-10-03T14:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.907335 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.907438 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.907459 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.907481 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:58 crc kubenswrapper[4774]: I1003 14:43:58.907537 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:58Z","lastTransitionTime":"2025-10-03T14:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.010206 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.010271 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.010289 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.010313 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.010331 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:59Z","lastTransitionTime":"2025-10-03T14:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.113281 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.113314 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.113325 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.113352 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.113363 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:59Z","lastTransitionTime":"2025-10-03T14:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.215781 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.215840 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.215853 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.215872 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.215884 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:59Z","lastTransitionTime":"2025-10-03T14:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.299746 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:43:59 crc kubenswrapper[4774]: E1003 14:43:59.300513 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.319733 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.319781 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.319793 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.319813 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.319827 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:59Z","lastTransitionTime":"2025-10-03T14:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.319836 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576
068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:59Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.336118 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://686819b674279f59619905ac4ba451ca79c97fc652a773307eca3fad2352ddda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6
a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:59Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.357867 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd06fa447f9a388f7cabafaff05c8d6dc9aedc161d2097f81855318c8b7712e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b330d308f14c0b3b3630e89fe3e43728e17dbc3fe78cf9cae365a3ab0474c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b575a360d2ee43bc2629635a4ffda77a1b0e1a171bbd99b76b725784be537e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae8d985dec947c332efa41abbc5c30a978f06942cc62a86ab45e90fc2e5d763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aaa74c70ca8d93d672246f3451771bffa1e304c88a453cbbe3a305a304954fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0d624981ca616d27057de09511f8a17b643686871e2e77c15bd1c65fba9c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8a4afd85c0d75edd3184845b7cd816499325c7f7ecef0f46b961419dfe6dd86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8a4afd85c0d75edd3184845b7cd816499325c7f7ecef0f46b961419dfe6dd86\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert 
Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 14:43:53.739938 6460 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1003 14:43:53.739975 6460 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1003 14:43:53.739997 6460 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1003 14:43:53.740056 6460 factory.go:1336] Added *v1.Node event handler 7\\\\nI1003 14:43:53.740098 6460 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1003 14:43:53.740599 6460 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1003 14:43:53.740805 6460 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1003 14:43:53.740883 6460 ovnkube.go:599] Stopped ovnkube\\\\nI1003 14:43:53.740916 6460 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 14:43:53.741090 6460 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jzv75_openshift-ovn-kubernetes(01bef0c3-23b3-4f49-8c33-3f2ec7503b12)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac1b1bb4b30aa1d11558ff126ad87e4dd7b9aaf255cd4a1860aacdee30a0bb15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948
874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:59Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.369857 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ghf5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d3c89f-9fbd-4d50-840a-c5c78528c903\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ghf5t\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:59Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.386047 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d94e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:59Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:59 crc 
kubenswrapper[4774]: I1003 14:43:59.401917 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:59Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.415427 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:59Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.422470 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.422514 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.422528 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.422554 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.422566 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:59Z","lastTransitionTime":"2025-10-03T14:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.429354 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:59Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.442846 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T14:43:59Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.458045 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:59Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.473628 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:59Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.489851 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16e8c0249ee2448082160abe417f67a17fac176add76d00dd5e755c0f9a6a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378
529aecd2971532c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:
43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:33Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:59Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.501066 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{
\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:59Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.513276 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5f4df4f-152d-4e0f-b040-8772ae05ccbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a653e60009bbcb51465ae267a42828fbf1e9863cc50c73ab2f3b490b5f1524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4623e559e31f88609bb4a406374a3ce997e19ca75d71bc291cb98ed209f1e2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e63db8eac87987c01cd81f4cc676a2bb7268ee091097ce8b1ed6a14bc3f2346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://272d3589475ea3f31e989b1c7326b814559a8c6cb5f9e5e042ffba28cb1b7c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://272d3589475ea3f31e989b1c7326b814559a8c6cb5f9e5e042ffba28cb1b7c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:59Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.526180 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.526338 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.526460 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.526556 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.526628 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:59Z","lastTransitionTime":"2025-10-03T14:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.541391 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:59Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.560500 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:59Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.576739 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:59Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.594197 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a01f2ab-7f7c-411c-b424-0d382dee6976\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b65b48ad13df81dd9ccb04fd8aeebbf976409d5d318f51a923c2dfc0cd8cc7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3803c97bfaf2ffb04a8a6924cb97ee022c7d34a38e1a50ce63a7b8f062fc901d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jlwtx\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:43:59Z is after 2025-08-24T17:21:41Z" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.628741 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.628787 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.628804 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.628820 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.628829 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:59Z","lastTransitionTime":"2025-10-03T14:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.731687 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.731920 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.732002 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.732168 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.732276 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:59Z","lastTransitionTime":"2025-10-03T14:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.835194 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.835691 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.835774 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.835894 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.835981 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:59Z","lastTransitionTime":"2025-10-03T14:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.938800 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.938840 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.938852 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.938873 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:43:59 crc kubenswrapper[4774]: I1003 14:43:59.938885 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:43:59Z","lastTransitionTime":"2025-10-03T14:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.041118 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.041491 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.041639 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.041768 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.041887 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:00Z","lastTransitionTime":"2025-10-03T14:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.145045 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.145426 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.145629 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.145828 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.146017 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:00Z","lastTransitionTime":"2025-10-03T14:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.248866 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.248904 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.248912 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.248928 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.248938 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:00Z","lastTransitionTime":"2025-10-03T14:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.298499 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.298535 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:44:00 crc kubenswrapper[4774]: E1003 14:44:00.298643 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.298713 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:44:00 crc kubenswrapper[4774]: E1003 14:44:00.298831 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:44:00 crc kubenswrapper[4774]: E1003 14:44:00.298955 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.351664 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.351985 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.352183 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.352591 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.352772 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:00Z","lastTransitionTime":"2025-10-03T14:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.455254 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.455648 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.455840 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.456062 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.456259 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:00Z","lastTransitionTime":"2025-10-03T14:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.559543 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.559588 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.559600 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.559616 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.559629 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:00Z","lastTransitionTime":"2025-10-03T14:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.661628 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.661689 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.661699 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.661714 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.661727 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:00Z","lastTransitionTime":"2025-10-03T14:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.764779 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.764849 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.764868 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.764893 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.764911 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:00Z","lastTransitionTime":"2025-10-03T14:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.867887 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.868177 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.868294 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.868487 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.868605 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:00Z","lastTransitionTime":"2025-10-03T14:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.972502 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.973339 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.973557 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.973583 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:00 crc kubenswrapper[4774]: I1003 14:44:00.973598 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:00Z","lastTransitionTime":"2025-10-03T14:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.076466 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.076513 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.076526 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.076544 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.076558 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:01Z","lastTransitionTime":"2025-10-03T14:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.178900 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.178970 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.178995 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.179027 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.179047 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:01Z","lastTransitionTime":"2025-10-03T14:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.281261 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.281312 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.281323 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.281343 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.281356 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:01Z","lastTransitionTime":"2025-10-03T14:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.298779 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:44:01 crc kubenswrapper[4774]: E1003 14:44:01.298960 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.383846 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.383897 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.383906 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.383920 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.383940 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:01Z","lastTransitionTime":"2025-10-03T14:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.487108 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.487183 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.487201 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.487227 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.487244 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:01Z","lastTransitionTime":"2025-10-03T14:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.590147 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.590197 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.590208 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.590221 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.590230 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:01Z","lastTransitionTime":"2025-10-03T14:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.692787 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.692824 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.692836 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.692852 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.692864 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:01Z","lastTransitionTime":"2025-10-03T14:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.795498 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.795548 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.795557 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.795572 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.795583 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:01Z","lastTransitionTime":"2025-10-03T14:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.899041 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.899133 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.899145 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.899166 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:01 crc kubenswrapper[4774]: I1003 14:44:01.899187 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:01Z","lastTransitionTime":"2025-10-03T14:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.002408 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.002466 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.002481 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.002504 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.002518 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:02Z","lastTransitionTime":"2025-10-03T14:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.105174 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.105499 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.105513 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.105532 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.105545 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:02Z","lastTransitionTime":"2025-10-03T14:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.208119 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.208168 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.208179 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.208195 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.208206 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:02Z","lastTransitionTime":"2025-10-03T14:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.299444 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.299507 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.299516 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:44:02 crc kubenswrapper[4774]: E1003 14:44:02.299627 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:44:02 crc kubenswrapper[4774]: E1003 14:44:02.299773 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:44:02 crc kubenswrapper[4774]: E1003 14:44:02.299968 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.311277 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.311331 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.311349 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.311398 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.311416 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:02Z","lastTransitionTime":"2025-10-03T14:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.414782 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.414839 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.414851 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.414870 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.414881 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:02Z","lastTransitionTime":"2025-10-03T14:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.517443 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.517484 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.517492 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.517505 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.517514 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:02Z","lastTransitionTime":"2025-10-03T14:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.620536 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.620572 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.620582 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.620596 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.620606 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:02Z","lastTransitionTime":"2025-10-03T14:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.723362 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.723424 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.723434 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.723448 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.723456 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:02Z","lastTransitionTime":"2025-10-03T14:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.826294 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.826355 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.826366 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.826418 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.826430 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:02Z","lastTransitionTime":"2025-10-03T14:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.928764 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.928847 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.928865 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.928889 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:02 crc kubenswrapper[4774]: I1003 14:44:02.928906 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:02Z","lastTransitionTime":"2025-10-03T14:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.031482 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.031542 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.031555 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.031571 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.031583 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:03Z","lastTransitionTime":"2025-10-03T14:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.133826 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.133867 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.133877 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.133896 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.133905 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:03Z","lastTransitionTime":"2025-10-03T14:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.236125 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.236184 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.236199 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.236221 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.236236 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:03Z","lastTransitionTime":"2025-10-03T14:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.244006 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.244074 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.244095 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.244118 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.244134 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:03Z","lastTransitionTime":"2025-10-03T14:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:03 crc kubenswrapper[4774]: E1003 14:44:03.263738 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:03Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.267907 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.267949 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.267958 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.267974 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.267984 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:03Z","lastTransitionTime":"2025-10-03T14:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:03 crc kubenswrapper[4774]: E1003 14:44:03.285271 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:03Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.289590 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.289632 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.289641 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.289654 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.289662 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:03Z","lastTransitionTime":"2025-10-03T14:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.299321 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:44:03 crc kubenswrapper[4774]: E1003 14:44:03.299454 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:44:03 crc kubenswrapper[4774]: E1003 14:44:03.306307 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:03Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.310052 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.310083 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.310094 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.310109 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.310121 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:03Z","lastTransitionTime":"2025-10-03T14:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:03 crc kubenswrapper[4774]: E1003 14:44:03.324280 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:03Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.328569 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.328608 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.328622 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.328640 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.328653 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:03Z","lastTransitionTime":"2025-10-03T14:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:03 crc kubenswrapper[4774]: E1003 14:44:03.343962 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:03Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:03 crc kubenswrapper[4774]: E1003 14:44:03.344118 4774 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.345872 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.345911 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.345922 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.345938 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.345950 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:03Z","lastTransitionTime":"2025-10-03T14:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.448751 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.448786 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.448798 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.448812 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.448821 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:03Z","lastTransitionTime":"2025-10-03T14:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.551098 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.551434 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.551541 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.551626 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.551709 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:03Z","lastTransitionTime":"2025-10-03T14:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.654222 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.654296 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.654320 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.654349 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.654404 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:03Z","lastTransitionTime":"2025-10-03T14:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.756447 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.756476 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.756484 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.756496 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.756504 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:03Z","lastTransitionTime":"2025-10-03T14:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.859105 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.859152 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.859169 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.859192 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.859208 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:03Z","lastTransitionTime":"2025-10-03T14:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.960869 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.960909 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.960921 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.960937 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:03 crc kubenswrapper[4774]: I1003 14:44:03.960949 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:03Z","lastTransitionTime":"2025-10-03T14:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.063352 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.063437 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.063451 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.063469 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.063481 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:04Z","lastTransitionTime":"2025-10-03T14:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.165421 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.165474 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.165484 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.165501 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.165511 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:04Z","lastTransitionTime":"2025-10-03T14:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.268172 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.268222 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.268235 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.268255 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.268267 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:04Z","lastTransitionTime":"2025-10-03T14:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.298791 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.298832 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.298869 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:44:04 crc kubenswrapper[4774]: E1003 14:44:04.298966 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:44:04 crc kubenswrapper[4774]: E1003 14:44:04.299038 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:44:04 crc kubenswrapper[4774]: E1003 14:44:04.299196 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.370518 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.370579 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.370598 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.370623 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.370642 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:04Z","lastTransitionTime":"2025-10-03T14:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.472805 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.472848 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.472859 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.472875 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.472884 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:04Z","lastTransitionTime":"2025-10-03T14:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.575437 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.575511 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.575524 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.575541 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.575554 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:04Z","lastTransitionTime":"2025-10-03T14:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.677780 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.677819 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.677827 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.677841 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.677850 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:04Z","lastTransitionTime":"2025-10-03T14:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.780183 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.780229 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.780239 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.780257 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.780266 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:04Z","lastTransitionTime":"2025-10-03T14:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.882844 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.883072 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.883197 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.883292 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.883402 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:04Z","lastTransitionTime":"2025-10-03T14:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.986069 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.986351 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.986436 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.986504 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:04 crc kubenswrapper[4774]: I1003 14:44:04.986562 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:04Z","lastTransitionTime":"2025-10-03T14:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.089433 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.089466 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.089475 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.089488 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.089496 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:05Z","lastTransitionTime":"2025-10-03T14:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.190993 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.191051 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.191063 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.191082 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.191094 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:05Z","lastTransitionTime":"2025-10-03T14:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.293519 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.293775 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.293856 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.293934 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.294003 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:05Z","lastTransitionTime":"2025-10-03T14:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.298808 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:44:05 crc kubenswrapper[4774]: E1003 14:44:05.298926 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.396241 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.396296 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.396313 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.396331 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.396344 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:05Z","lastTransitionTime":"2025-10-03T14:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.498876 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.498918 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.498930 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.498947 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.498960 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:05Z","lastTransitionTime":"2025-10-03T14:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.601711 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.601919 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.601975 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.602039 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.602097 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:05Z","lastTransitionTime":"2025-10-03T14:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.704464 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.704491 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.704499 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.704510 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.704518 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:05Z","lastTransitionTime":"2025-10-03T14:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.806969 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.807001 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.807013 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.807028 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.807041 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:05Z","lastTransitionTime":"2025-10-03T14:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.909150 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.909213 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.909231 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.909256 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:05 crc kubenswrapper[4774]: I1003 14:44:05.909273 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:05Z","lastTransitionTime":"2025-10-03T14:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.011621 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.011663 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.011671 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.011694 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.011703 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:06Z","lastTransitionTime":"2025-10-03T14:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.114662 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.115253 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.115340 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.115491 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.115575 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:06Z","lastTransitionTime":"2025-10-03T14:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.218449 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.219006 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.219557 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.219773 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.219964 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:06Z","lastTransitionTime":"2025-10-03T14:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.298702 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:44:06 crc kubenswrapper[4774]: E1003 14:44:06.298856 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.299055 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:44:06 crc kubenswrapper[4774]: E1003 14:44:06.299128 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.299441 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:44:06 crc kubenswrapper[4774]: E1003 14:44:06.299733 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.322660 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.322692 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.322705 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.322719 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.322730 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:06Z","lastTransitionTime":"2025-10-03T14:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.424617 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.424853 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.424924 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.424994 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.425091 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:06Z","lastTransitionTime":"2025-10-03T14:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.527981 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.528260 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.528347 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.528460 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.528549 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:06Z","lastTransitionTime":"2025-10-03T14:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.630448 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.630663 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.630739 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.630806 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.630863 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:06Z","lastTransitionTime":"2025-10-03T14:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.732943 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.733000 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.733017 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.733041 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.733058 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:06Z","lastTransitionTime":"2025-10-03T14:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.835429 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.835463 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.835474 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.835492 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.835503 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:06Z","lastTransitionTime":"2025-10-03T14:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.937572 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.937612 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.937625 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.937640 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:06 crc kubenswrapper[4774]: I1003 14:44:06.937651 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:06Z","lastTransitionTime":"2025-10-03T14:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.039573 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.039607 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.039619 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.039666 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.039677 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:07Z","lastTransitionTime":"2025-10-03T14:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.142848 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.142885 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.142898 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.142915 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.142926 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:07Z","lastTransitionTime":"2025-10-03T14:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.245384 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.245419 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.245428 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.245442 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.245450 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:07Z","lastTransitionTime":"2025-10-03T14:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.299283 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:44:07 crc kubenswrapper[4774]: E1003 14:44:07.299909 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.347684 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.347718 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.347728 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.347742 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.347751 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:07Z","lastTransitionTime":"2025-10-03T14:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.449627 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.449669 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.449679 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.449694 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.449705 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:07Z","lastTransitionTime":"2025-10-03T14:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.555407 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.555457 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.555472 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.555488 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.555501 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:07Z","lastTransitionTime":"2025-10-03T14:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.657588 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.657650 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.657662 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.657681 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.657698 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:07Z","lastTransitionTime":"2025-10-03T14:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.760238 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.760282 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.760293 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.760309 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.760320 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:07Z","lastTransitionTime":"2025-10-03T14:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.863015 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.863070 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.863093 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.863126 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.863150 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:07Z","lastTransitionTime":"2025-10-03T14:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.965596 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.965656 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.965669 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.965688 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:07 crc kubenswrapper[4774]: I1003 14:44:07.965703 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:07Z","lastTransitionTime":"2025-10-03T14:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.068587 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.068632 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.068649 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.068668 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.068683 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:08Z","lastTransitionTime":"2025-10-03T14:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.170565 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.170618 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.170634 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.170656 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.170668 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:08Z","lastTransitionTime":"2025-10-03T14:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.274132 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.274174 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.274183 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.274198 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.274208 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:08Z","lastTransitionTime":"2025-10-03T14:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.299252 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.299354 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.299495 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:44:08 crc kubenswrapper[4774]: E1003 14:44:08.299489 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:44:08 crc kubenswrapper[4774]: E1003 14:44:08.299591 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:44:08 crc kubenswrapper[4774]: E1003 14:44:08.299719 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.300453 4774 scope.go:117] "RemoveContainer" containerID="d8a4afd85c0d75edd3184845b7cd816499325c7f7ecef0f46b961419dfe6dd86" Oct 03 14:44:08 crc kubenswrapper[4774]: E1003 14:44:08.300709 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-jzv75_openshift-ovn-kubernetes(01bef0c3-23b3-4f49-8c33-3f2ec7503b12)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.376483 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.376523 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.376534 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.376554 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.376565 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:08Z","lastTransitionTime":"2025-10-03T14:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.478878 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.478949 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.478967 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.478989 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.479005 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:08Z","lastTransitionTime":"2025-10-03T14:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.581283 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.581360 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.581393 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.581409 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.581419 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:08Z","lastTransitionTime":"2025-10-03T14:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.683279 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.683347 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.683361 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.683409 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.683423 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:08Z","lastTransitionTime":"2025-10-03T14:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.794056 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.794095 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.794105 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.794119 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.794128 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:08Z","lastTransitionTime":"2025-10-03T14:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.896133 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.896162 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.896170 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.896183 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.896193 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:08Z","lastTransitionTime":"2025-10-03T14:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.998015 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.998066 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.998088 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.998102 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:08 crc kubenswrapper[4774]: I1003 14:44:08.998110 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:08Z","lastTransitionTime":"2025-10-03T14:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.100623 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.100663 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.100684 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.100706 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.100716 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:09Z","lastTransitionTime":"2025-10-03T14:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.202660 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.202696 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.202706 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.202720 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.202729 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:09Z","lastTransitionTime":"2025-10-03T14:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.299236 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:44:09 crc kubenswrapper[4774]: E1003 14:44:09.299434 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.303841 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.303869 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.303876 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.303887 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.303896 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:09Z","lastTransitionTime":"2025-10-03T14:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.314731 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a01f2ab-7f7c-411c-b424-0d382dee6976\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b65b48ad13df81dd9ccb04fd8aeebbf976409d5d318f51a923c2dfc0cd8cc7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3803c97bfaf2ffb04a8a6924cb97ee022c7d34a38e1a50ce63a7b8f062fc901d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jlwtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:09Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.330075 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f781
4a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://686819b674279f59619905ac4ba451ca79c97fc652a773307eca3fad2352ddda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T
14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6
db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:09Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.357846 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd06fa447f9a388f7cabafaff05c8d6dc9aedc161d2097f81855318c8b7712e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b330d308f14c0b3b3630e89fe3e43728e17dbc3fe78cf9cae365a3ab0474c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b575a360d2ee43bc2629635a4ffda77a1b0e1a171bbd99b76b725784be537e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae8d985dec947c332efa41abbc5c30a978f06942cc62a86ab45e90fc2e5d763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aaa74c70ca8d93d672246f3451771bffa1e304c88a453cbbe3a305a304954fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0d624981ca616d27057de09511f8a17b643686871e2e77c15bd1c65fba9c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8a4afd85c0d75edd3184845b7cd816499325c7f7ecef0f46b961419dfe6dd86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8a4afd85c0d75edd3184845b7cd816499325c7f7ecef0f46b961419dfe6dd86\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert 
Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 14:43:53.739938 6460 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1003 14:43:53.739975 6460 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1003 14:43:53.739997 6460 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1003 14:43:53.740056 6460 factory.go:1336] Added *v1.Node event handler 7\\\\nI1003 14:43:53.740098 6460 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1003 14:43:53.740599 6460 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1003 14:43:53.740805 6460 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1003 14:43:53.740883 6460 ovnkube.go:599] Stopped ovnkube\\\\nI1003 14:43:53.740916 6460 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 14:43:53.741090 6460 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jzv75_openshift-ovn-kubernetes(01bef0c3-23b3-4f49-8c33-3f2ec7503b12)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac1b1bb4b30aa1d11558ff126ad87e4dd7b9aaf255cd4a1860aacdee30a0bb15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948
874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:09Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.379092 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ghf5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d3c89f-9fbd-4d50-840a-c5c78528c903\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ghf5t\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:09Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.391706 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:09Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.404198 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:09Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.405424 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.405460 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.405504 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.405521 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.405533 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:09Z","lastTransitionTime":"2025-10-03T14:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.418908 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:09Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.434231 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T14:44:09Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.446540 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:09Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.462817 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d94e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4
317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:09Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.475972 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:09Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.496077 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
25-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:09Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.508689 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.508692 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:09Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.508735 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.508746 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.508761 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.508773 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:09Z","lastTransitionTime":"2025-10-03T14:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.518360 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:09Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.530743 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:09Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.543846 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16e8c0249ee2448082160abe417f67a17fac176add76d00dd5e755c0f9a6a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378
529aecd2971532c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:
43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:33Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:09Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.553128 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{
\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:09Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.564065 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5f4df4f-152d-4e0f-b040-8772ae05ccbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a653e60009bbcb51465ae267a42828fbf1e9863cc50c73ab2f3b490b5f1524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4623e559e31f88609bb4a406374a3ce997e19ca75d71bc291cb98ed209f1e2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e63db8eac87987c01cd81f4cc676a2bb7268ee091097ce8b1ed6a14bc3f2346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://272d3589475ea3f31e989b1c7326b814559a8c6cb5f9e5e042ffba28cb1b7c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://272d3589475ea3f31e989b1c7326b814559a8c6cb5f9e5e042ffba28cb1b7c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:09Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.611292 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.611328 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.611339 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.611354 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.611365 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:09Z","lastTransitionTime":"2025-10-03T14:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.712768 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.712813 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.712824 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.712843 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.712853 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:09Z","lastTransitionTime":"2025-10-03T14:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.815648 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.815694 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.815702 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.815717 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.815726 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:09Z","lastTransitionTime":"2025-10-03T14:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.917871 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.917905 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.917914 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.917928 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:09 crc kubenswrapper[4774]: I1003 14:44:09.917937 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:09Z","lastTransitionTime":"2025-10-03T14:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.020426 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.020479 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.020601 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.020623 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.020639 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:10Z","lastTransitionTime":"2025-10-03T14:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.123192 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.123239 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.123251 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.123268 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.123279 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:10Z","lastTransitionTime":"2025-10-03T14:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.226219 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.226265 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.226279 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.226296 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.226308 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:10Z","lastTransitionTime":"2025-10-03T14:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.299186 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.299245 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.299245 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:44:10 crc kubenswrapper[4774]: E1003 14:44:10.299319 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:44:10 crc kubenswrapper[4774]: E1003 14:44:10.299539 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:44:10 crc kubenswrapper[4774]: E1003 14:44:10.299669 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.327919 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.327978 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.327992 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.328007 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.328019 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:10Z","lastTransitionTime":"2025-10-03T14:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.430308 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.430385 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.430401 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.430419 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.430435 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:10Z","lastTransitionTime":"2025-10-03T14:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.532680 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.532714 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.532725 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.532740 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.532751 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:10Z","lastTransitionTime":"2025-10-03T14:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.636364 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.636448 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.636457 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.636475 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.636506 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:10Z","lastTransitionTime":"2025-10-03T14:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.739309 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.739408 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.739431 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.739460 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.739483 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:10Z","lastTransitionTime":"2025-10-03T14:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.841533 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.841868 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.841968 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.842070 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.842179 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:10Z","lastTransitionTime":"2025-10-03T14:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.945146 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.945209 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.945230 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.945257 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:10 crc kubenswrapper[4774]: I1003 14:44:10.945275 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:10Z","lastTransitionTime":"2025-10-03T14:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.047430 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.047461 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.047470 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.047482 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.047492 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:11Z","lastTransitionTime":"2025-10-03T14:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.149091 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.149131 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.149141 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.149154 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.149164 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:11Z","lastTransitionTime":"2025-10-03T14:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.251476 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.251562 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.251583 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.251609 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.251627 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:11Z","lastTransitionTime":"2025-10-03T14:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.299018 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:44:11 crc kubenswrapper[4774]: E1003 14:44:11.299144 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.353322 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.353615 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.353709 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.353806 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.353893 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:11Z","lastTransitionTime":"2025-10-03T14:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.456281 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.456845 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.457006 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.457147 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.457280 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:11Z","lastTransitionTime":"2025-10-03T14:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.560324 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.560362 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.560395 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.560414 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.560426 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:11Z","lastTransitionTime":"2025-10-03T14:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.663402 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.663650 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.663714 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.663778 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.663871 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:11Z","lastTransitionTime":"2025-10-03T14:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.766068 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.766362 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.766483 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.766577 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.766660 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:11Z","lastTransitionTime":"2025-10-03T14:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.869228 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.869270 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.869283 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.869299 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.869310 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:11Z","lastTransitionTime":"2025-10-03T14:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.972043 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.972102 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.972115 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.972138 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:11 crc kubenswrapper[4774]: I1003 14:44:11.972150 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:11Z","lastTransitionTime":"2025-10-03T14:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.049099 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88d3c89f-9fbd-4d50-840a-c5c78528c903-metrics-certs\") pod \"network-metrics-daemon-ghf5t\" (UID: \"88d3c89f-9fbd-4d50-840a-c5c78528c903\") " pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:44:12 crc kubenswrapper[4774]: E1003 14:44:12.049290 4774 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 14:44:12 crc kubenswrapper[4774]: E1003 14:44:12.049741 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88d3c89f-9fbd-4d50-840a-c5c78528c903-metrics-certs podName:88d3c89f-9fbd-4d50-840a-c5c78528c903 nodeName:}" failed. No retries permitted until 2025-10-03 14:44:44.049716769 +0000 UTC m=+106.638920251 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/88d3c89f-9fbd-4d50-840a-c5c78528c903-metrics-certs") pod "network-metrics-daemon-ghf5t" (UID: "88d3c89f-9fbd-4d50-840a-c5c78528c903") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.074469 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.074541 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.074558 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.074586 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.074609 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:12Z","lastTransitionTime":"2025-10-03T14:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.176549 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.176593 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.176609 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.176628 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.176644 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:12Z","lastTransitionTime":"2025-10-03T14:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.278441 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.278483 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.278495 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.278512 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.278523 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:12Z","lastTransitionTime":"2025-10-03T14:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.298794 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:44:12 crc kubenswrapper[4774]: E1003 14:44:12.299049 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.298822 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:44:12 crc kubenswrapper[4774]: E1003 14:44:12.299143 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.298822 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:44:12 crc kubenswrapper[4774]: E1003 14:44:12.299198 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.380922 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.380973 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.380990 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.381012 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.381029 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:12Z","lastTransitionTime":"2025-10-03T14:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.483101 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.483157 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.483176 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.483197 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.483214 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:12Z","lastTransitionTime":"2025-10-03T14:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.585418 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.585453 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.585479 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.585493 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.585502 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:12Z","lastTransitionTime":"2025-10-03T14:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.688072 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.688287 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.688396 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.688531 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.688797 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:12Z","lastTransitionTime":"2025-10-03T14:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.791505 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.791748 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.791855 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.791979 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.792096 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:12Z","lastTransitionTime":"2025-10-03T14:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.895660 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.895726 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.895741 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.895761 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.895774 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:12Z","lastTransitionTime":"2025-10-03T14:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.998740 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.998803 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.998826 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.998850 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:12 crc kubenswrapper[4774]: I1003 14:44:12.998867 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:12Z","lastTransitionTime":"2025-10-03T14:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.101587 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.101894 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.102073 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.102236 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.102408 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:13Z","lastTransitionTime":"2025-10-03T14:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.205600 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.205665 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.205682 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.205705 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.205723 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:13Z","lastTransitionTime":"2025-10-03T14:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.299163 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:44:13 crc kubenswrapper[4774]: E1003 14:44:13.299325 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.307996 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.308028 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.308042 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.308056 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.308067 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:13Z","lastTransitionTime":"2025-10-03T14:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.410608 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.410705 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.410724 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.410852 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.410907 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:13Z","lastTransitionTime":"2025-10-03T14:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.513585 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.514094 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.514214 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.514320 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.514431 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:13Z","lastTransitionTime":"2025-10-03T14:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.580732 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.581020 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.581182 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.581310 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.581429 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:13Z","lastTransitionTime":"2025-10-03T14:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:13 crc kubenswrapper[4774]: E1003 14:44:13.603181 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.607777 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.607826 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.607843 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.607865 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.607879 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:13Z","lastTransitionTime":"2025-10-03T14:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:13 crc kubenswrapper[4774]: E1003 14:44:13.625850 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.631000 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.631045 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.631057 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.631075 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.631092 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:13Z","lastTransitionTime":"2025-10-03T14:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:13 crc kubenswrapper[4774]: E1003 14:44:13.649765 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.654428 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.654483 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.654498 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.654516 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.654527 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:13Z","lastTransitionTime":"2025-10-03T14:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:13 crc kubenswrapper[4774]: E1003 14:44:13.669616 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.674149 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.674207 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.674225 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.674251 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.674273 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:13Z","lastTransitionTime":"2025-10-03T14:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:13 crc kubenswrapper[4774]: E1003 14:44:13.694505 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:13 crc kubenswrapper[4774]: E1003 14:44:13.694739 4774 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.696581 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.696935 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.697093 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.697260 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.697473 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:13Z","lastTransitionTime":"2025-10-03T14:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.801014 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.801068 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.801080 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.801099 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.801112 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:13Z","lastTransitionTime":"2025-10-03T14:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.903303 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.903342 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.903356 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.903447 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:13 crc kubenswrapper[4774]: I1003 14:44:13.903465 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:13Z","lastTransitionTime":"2025-10-03T14:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.006031 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.006108 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.006124 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.006148 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.006165 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:14Z","lastTransitionTime":"2025-10-03T14:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.108502 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.108536 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.108548 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.108561 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.108569 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:14Z","lastTransitionTime":"2025-10-03T14:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.211435 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.211502 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.211513 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.211532 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.211545 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:14Z","lastTransitionTime":"2025-10-03T14:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.298905 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.298984 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.299112 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:44:14 crc kubenswrapper[4774]: E1003 14:44:14.299098 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:44:14 crc kubenswrapper[4774]: E1003 14:44:14.299286 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:44:14 crc kubenswrapper[4774]: E1003 14:44:14.299521 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.313952 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.314123 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.314241 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.314348 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.314491 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:14Z","lastTransitionTime":"2025-10-03T14:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.416659 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.416726 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.416737 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.416752 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.416762 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:14Z","lastTransitionTime":"2025-10-03T14:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.520202 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.520235 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.520247 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.520316 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.520332 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:14Z","lastTransitionTime":"2025-10-03T14:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.623682 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.623732 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.623755 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.623781 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.623802 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:14Z","lastTransitionTime":"2025-10-03T14:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.725947 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.726241 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.726335 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.726459 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.726540 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:14Z","lastTransitionTime":"2025-10-03T14:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.829622 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.829698 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.829724 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.829754 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.829780 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:14Z","lastTransitionTime":"2025-10-03T14:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.932920 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.932969 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.932984 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.933004 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:14 crc kubenswrapper[4774]: I1003 14:44:14.933018 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:14Z","lastTransitionTime":"2025-10-03T14:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.036954 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.037020 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.037035 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.037053 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.037067 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:15Z","lastTransitionTime":"2025-10-03T14:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.139995 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.140684 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.140727 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.140768 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.140787 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:15Z","lastTransitionTime":"2025-10-03T14:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.243082 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.243132 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.243147 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.243165 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.243180 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:15Z","lastTransitionTime":"2025-10-03T14:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.299031 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:44:15 crc kubenswrapper[4774]: E1003 14:44:15.299172 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.344878 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.344918 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.344934 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.344952 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.344962 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:15Z","lastTransitionTime":"2025-10-03T14:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.447579 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.447632 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.447648 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.447668 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.447685 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:15Z","lastTransitionTime":"2025-10-03T14:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.550162 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.550204 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.550214 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.550231 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.550242 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:15Z","lastTransitionTime":"2025-10-03T14:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.652942 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.652986 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.652996 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.653012 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.653021 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:15Z","lastTransitionTime":"2025-10-03T14:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.724717 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jk5hb_4f2cc8dc-61c3-4a0b-8da3-b899094eaa53/kube-multus/0.log" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.724774 4774 generic.go:334] "Generic (PLEG): container finished" podID="4f2cc8dc-61c3-4a0b-8da3-b899094eaa53" containerID="67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b" exitCode=1 Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.724805 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jk5hb" event={"ID":"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53","Type":"ContainerDied","Data":"67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b"} Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.725196 4774 scope.go:117] "RemoveContainer" containerID="67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.757003 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd06fa447f9a388f7cabafaff05c8d6dc9aedc161d2097f81855318c8b7712e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b330d308f14c0b3b3630e89fe3e43728e17dbc3fe78cf9cae365a3ab0474c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b575a360d2ee43bc2629635a4ffda77a1b0e1a171bbd99b76b725784be537e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae8d985dec947c332efa41abbc5c30a978f06942cc62a86ab45e90fc2e5d763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aaa74c70ca8d93d672246f3451771bffa1e304c88a453cbbe3a305a304954fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0d624981ca616d27057de09511f8a17b643686871e2e77c15bd1c65fba9c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8a4afd85c0d75edd3184845b7cd816499325c7f7ecef0f46b961419dfe6dd86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8a4afd85c0d75edd3184845b7cd816499325c7f7ecef0f46b961419dfe6dd86\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert 
Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 14:43:53.739938 6460 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1003 14:43:53.739975 6460 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1003 14:43:53.739997 6460 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1003 14:43:53.740056 6460 factory.go:1336] Added *v1.Node event handler 7\\\\nI1003 14:43:53.740098 6460 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1003 14:43:53.740599 6460 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1003 14:43:53.740805 6460 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1003 14:43:53.740883 6460 ovnkube.go:599] Stopped ovnkube\\\\nI1003 14:43:53.740916 6460 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 14:43:53.741090 6460 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jzv75_openshift-ovn-kubernetes(01bef0c3-23b3-4f49-8c33-3f2ec7503b12)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac1b1bb4b30aa1d11558ff126ad87e4dd7b9aaf255cd4a1860aacdee30a0bb15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948
874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:15Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.757183 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.757251 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.757265 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.757290 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.757302 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:15Z","lastTransitionTime":"2025-10-03T14:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.772188 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ghf5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d3c89f-9fbd-4d50-840a-c5c78528c903\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ghf5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:15Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:15 crc 
kubenswrapper[4774]: I1003 14:44:15.788513 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:15Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.805975 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://686819b674279f59619905ac4ba451ca79c97fc652a773307eca3fad2352ddda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6
a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:15Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.820854 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:15Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.834598 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T14:44:15Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.848718 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:15Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.859567 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.859609 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.859620 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.859636 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.859648 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:15Z","lastTransitionTime":"2025-10-03T14:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.860919 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d94e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-03T14:44:15Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.874148 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:15Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.887715 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:15Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.909634 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:15Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.925812 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:15Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.936744 4774 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:15Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.949471 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:44:15Z\\\",\\\"message\\\":\\\"2025-10-03T14:43:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6f69d832-7344-4942-a14b-a5a9f788cb85\\\\n2025-10-03T14:43:29+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6f69d832-7344-4942-a14b-a5a9f788cb85 to /host/opt/cni/bin/\\\\n2025-10-03T14:43:30Z [verbose] multus-daemon started\\\\n2025-10-03T14:43:30Z [verbose] Readiness Indicator file check\\\\n2025-10-03T14:44:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:15Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.962196 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.962228 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.962239 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.962251 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.962260 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:15Z","lastTransitionTime":"2025-10-03T14:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.962906 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16e8c0249ee2448082160abe417f67a17fac176add76d00dd5e755c0f9a6a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:15Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.972945 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:15Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.984774 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5f4df4f-152d-4e0f-b040-8772ae05ccbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a653e60009bbcb51465ae267a42828fbf1e9863cc50c73ab2f3b490b5f1524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4623e559e31f88609bb4a406374a3ce997e19ca75d71bc291cb98ed209f1e2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e63db8eac87987c01cd81f4cc676a2bb7268ee091097ce8b1ed6a14bc3f2346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://272d3589475ea3f31e989b1c7326b814559a8c6cb5f9e5e042ffba28cb1b7c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://272d3589475ea3f31e989b1c7326b814559a8c6cb5f9e5e042ffba28cb1b7c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:15Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:15 crc kubenswrapper[4774]: I1003 14:44:15.996671 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a01f2ab-7f7c-411c-b424-0d382dee6976\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b65b48ad13df81dd9ccb04fd8aeebbf976409d5d318f51a923c2dfc0cd8cc7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3803c97bfaf2ffb04a8a6924cb97ee022c7d3
4a38e1a50ce63a7b8f062fc901d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jlwtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:15Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.064675 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.064707 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.064717 4774 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.064732 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.064741 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:16Z","lastTransitionTime":"2025-10-03T14:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.167550 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.167598 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.167613 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.167634 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.167650 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:16Z","lastTransitionTime":"2025-10-03T14:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.270210 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.270251 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.270261 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.270275 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.270283 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:16Z","lastTransitionTime":"2025-10-03T14:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.299027 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:44:16 crc kubenswrapper[4774]: E1003 14:44:16.299216 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.299050 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.299325 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:44:16 crc kubenswrapper[4774]: E1003 14:44:16.299565 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:44:16 crc kubenswrapper[4774]: E1003 14:44:16.299712 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.373156 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.373220 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.373237 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.373261 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.373278 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:16Z","lastTransitionTime":"2025-10-03T14:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.476550 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.476612 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.476630 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.476651 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.476668 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:16Z","lastTransitionTime":"2025-10-03T14:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.579526 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.579574 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.579591 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.579623 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.579645 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:16Z","lastTransitionTime":"2025-10-03T14:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.683002 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.683055 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.683070 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.683094 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.683110 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:16Z","lastTransitionTime":"2025-10-03T14:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.728889 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jk5hb_4f2cc8dc-61c3-4a0b-8da3-b899094eaa53/kube-multus/0.log" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.728947 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jk5hb" event={"ID":"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53","Type":"ContainerStarted","Data":"a17e01ed9b7f955272f2c5bb14a9624d7faeb0a4727698c093a2acc4e2da71af"} Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.741742 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.753707 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://686819b674279f59619905ac4ba451ca79c97fc652a773307eca3fad2352ddda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6
a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.770850 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd06fa447f9a388f7cabafaff05c8d6dc9aedc161d2097f81855318c8b7712e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b330d308f14c0b3b3630e89fe3e43728e17dbc3fe78cf9cae365a3ab0474c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b575a360d2ee43bc2629635a4ffda77a1b0e1a171bbd99b76b725784be537e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae8d985dec947c332efa41abbc5c30a978f06942cc62a86ab45e90fc2e5d763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aaa74c70ca8d93d672246f3451771bffa1e304c88a453cbbe3a305a304954fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0d624981ca616d27057de09511f8a17b643686871e2e77c15bd1c65fba9c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8a4afd85c0d75edd3184845b7cd816499325c7f7ecef0f46b961419dfe6dd86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8a4afd85c0d75edd3184845b7cd816499325c7f7ecef0f46b961419dfe6dd86\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert 
Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 14:43:53.739938 6460 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1003 14:43:53.739975 6460 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1003 14:43:53.739997 6460 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1003 14:43:53.740056 6460 factory.go:1336] Added *v1.Node event handler 7\\\\nI1003 14:43:53.740098 6460 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1003 14:43:53.740599 6460 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1003 14:43:53.740805 6460 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1003 14:43:53.740883 6460 ovnkube.go:599] Stopped ovnkube\\\\nI1003 14:43:53.740916 6460 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 14:43:53.741090 6460 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jzv75_openshift-ovn-kubernetes(01bef0c3-23b3-4f49-8c33-3f2ec7503b12)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac1b1bb4b30aa1d11558ff126ad87e4dd7b9aaf255cd4a1860aacdee30a0bb15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948
874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.782199 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ghf5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d3c89f-9fbd-4d50-840a-c5c78528c903\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ghf5t\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.785452 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.785479 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.785488 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.785500 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.785510 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:16Z","lastTransitionTime":"2025-10-03T14:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.795810 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.807406 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d94e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4
317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.819258 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.833662 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.845314 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.855074 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T14:44:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.867592 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.880743 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17e01ed9b7f955272f2c5bb14a9624d7faeb0a4727698c093a2acc4e2da71af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:44:15Z\\\",\\\"message\\\":\\\"2025-10-03T14:43:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6f69d832-7344-4942-a14b-a5a9f788cb85\\\\n2025-10-03T14:43:29+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6f69d832-7344-4942-a14b-a5a9f788cb85 to /host/opt/cni/bin/\\\\n2025-10-03T14:43:30Z [verbose] multus-daemon started\\\\n2025-10-03T14:43:30Z [verbose] 
Readiness Indicator file check\\\\n2025-10-03T14:44:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.888114 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.888149 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.888158 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.888172 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.888181 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:16Z","lastTransitionTime":"2025-10-03T14:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.894475 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16e8c0249ee2448082160abe417f67a17fac176add76d00dd5e755c0f9a6a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.903554 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.912882 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5f4df4f-152d-4e0f-b040-8772ae05ccbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a653e60009bbcb51465ae267a42828fbf1e9863cc50c73ab2f3b490b5f1524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4623e559e31f88609bb4a406374a3ce997e19ca75d71bc291cb98ed209f1e2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e63db8eac87987c01cd81f4cc676a2bb7268ee091097ce8b1ed6a14bc3f2346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://272d3589475ea3f31e989b1c7326b814559a8c6cb5f9e5e042ffba28cb1b7c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://272d3589475ea3f31e989b1c7326b814559a8c6cb5f9e5e042ffba28cb1b7c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.929404 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.940847 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.951605 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a01f2ab-7f7c-411c-b424-0d382dee6976\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b65b48ad13df81dd9ccb04fd8aeebbf976409d5d318f51a923c2dfc0cd8cc7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3803c97bfaf2ffb04a8a6924cb97ee022c7d3
4a38e1a50ce63a7b8f062fc901d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jlwtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.990528 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.990581 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.990599 4774 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.990623 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:16 crc kubenswrapper[4774]: I1003 14:44:16.990640 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:16Z","lastTransitionTime":"2025-10-03T14:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.093216 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.093259 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.093272 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.093290 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.093302 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:17Z","lastTransitionTime":"2025-10-03T14:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.195481 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.195534 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.195551 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.195577 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.195594 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:17Z","lastTransitionTime":"2025-10-03T14:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.297630 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.297683 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.297697 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.297715 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.297727 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:17Z","lastTransitionTime":"2025-10-03T14:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.298684 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:44:17 crc kubenswrapper[4774]: E1003 14:44:17.298901 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.400558 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.400598 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.400609 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.400632 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.400645 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:17Z","lastTransitionTime":"2025-10-03T14:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.502504 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.502549 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.502561 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.502586 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.502598 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:17Z","lastTransitionTime":"2025-10-03T14:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.605353 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.605423 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.605440 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.605470 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.605491 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:17Z","lastTransitionTime":"2025-10-03T14:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.708702 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.708771 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.708794 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.708829 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.708853 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:17Z","lastTransitionTime":"2025-10-03T14:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.811473 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.811503 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.811513 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.811525 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.811535 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:17Z","lastTransitionTime":"2025-10-03T14:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.913888 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.913917 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.913927 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.913940 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:17 crc kubenswrapper[4774]: I1003 14:44:17.913949 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:17Z","lastTransitionTime":"2025-10-03T14:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.015775 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.015814 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.015825 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.015842 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.015854 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:18Z","lastTransitionTime":"2025-10-03T14:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.119455 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.119540 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.119566 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.119598 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.119619 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:18Z","lastTransitionTime":"2025-10-03T14:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.223158 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.223214 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.223233 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.223257 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.223274 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:18Z","lastTransitionTime":"2025-10-03T14:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.298649 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.298701 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:44:18 crc kubenswrapper[4774]: E1003 14:44:18.298858 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.298920 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:44:18 crc kubenswrapper[4774]: E1003 14:44:18.299133 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:44:18 crc kubenswrapper[4774]: E1003 14:44:18.299264 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.326573 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.326623 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.326640 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.326664 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.326684 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:18Z","lastTransitionTime":"2025-10-03T14:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.430783 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.430858 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.430945 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.431034 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.431064 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:18Z","lastTransitionTime":"2025-10-03T14:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.533661 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.533723 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.533740 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.533768 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.533784 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:18Z","lastTransitionTime":"2025-10-03T14:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.636912 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.636982 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.637021 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.637048 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.637064 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:18Z","lastTransitionTime":"2025-10-03T14:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.738461 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.738525 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.738543 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.738566 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.738583 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:18Z","lastTransitionTime":"2025-10-03T14:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.841337 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.841472 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.841487 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.841521 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.841532 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:18Z","lastTransitionTime":"2025-10-03T14:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.944415 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.944487 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.944515 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.944542 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:18 crc kubenswrapper[4774]: I1003 14:44:18.944562 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:18Z","lastTransitionTime":"2025-10-03T14:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.048208 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.048241 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.048249 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.048262 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.048271 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:19Z","lastTransitionTime":"2025-10-03T14:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.151105 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.151143 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.151156 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.151171 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.151184 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:19Z","lastTransitionTime":"2025-10-03T14:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.253654 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.253711 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.253721 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.253735 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.253745 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:19Z","lastTransitionTime":"2025-10-03T14:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.298670 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:44:19 crc kubenswrapper[4774]: E1003 14:44:19.299308 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.299946 4774 scope.go:117] "RemoveContainer" containerID="d8a4afd85c0d75edd3184845b7cd816499325c7f7ecef0f46b961419dfe6dd86" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.312524 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d94e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 
14:44:19.326651 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.342888 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.357100 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.357135 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.357150 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.357167 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.357178 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:19Z","lastTransitionTime":"2025-10-03T14:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.357877 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.372435 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T14:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.385273 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.409970 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17e01ed9b7f955272f2c5bb14a9624d7faeb0a4727698c093a2acc4e2da71af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:44:15Z\\\",\\\"message\\\":\\\"2025-10-03T14:43:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6f69d832-7344-4942-a14b-a5a9f788cb85\\\\n2025-10-03T14:43:29+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6f69d832-7344-4942-a14b-a5a9f788cb85 to /host/opt/cni/bin/\\\\n2025-10-03T14:43:30Z [verbose] multus-daemon started\\\\n2025-10-03T14:43:30Z [verbose] 
Readiness Indicator file check\\\\n2025-10-03T14:44:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.427331 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16
e8c0249ee2448082160abe417f67a17fac176add76d00dd5e755c0f9a6a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.436584 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.449561 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5f4df4f-152d-4e0f-b040-8772ae05ccbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a653e60009bbcb51465ae267a42828fbf1e9863cc50c73ab2f3b490b5f1524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4623e559e31f88609bb4a406374a3ce997e19ca75d71bc291cb98ed209f1e2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e63db8eac87987c01cd81f4cc676a2bb7268ee091097ce8b1ed6a14bc3f2346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://272d3589475ea3f31e989b1c7326b814559a8c6cb5f9e5e042ffba28cb1b7c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://272d3589475ea3f31e989b1c7326b814559a8c6cb5f9e5e042ffba28cb1b7c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.458943 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.458971 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.458979 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 
03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.458991 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.459000 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:19Z","lastTransitionTime":"2025-10-03T14:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.478334 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.497603 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.511605 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.526226 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a01f2ab-7f7c-411c-b424-0d382dee6976\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b65b48ad13df81dd9ccb04fd8aeebbf976409d5d318f51a923c2dfc0cd8cc7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3803c97bfaf2ffb04a8a6924cb97ee022c7d34a38e1a50ce63a7b8f062fc901d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jlwtx\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.541267 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:4
3:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55
e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.561604 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://686819b674279f59619905ac4ba451ca79c97fc652a773307eca3fad2352ddda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6
a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.562545 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.562600 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.562612 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.562628 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.562640 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:19Z","lastTransitionTime":"2025-10-03T14:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.583051 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd06fa447f9a388f7cabafaff05c8d6dc9aedc161d2097f81855318c8b7712e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b330d308f14c0b3b3630e89fe3e43728e17dbc3fe78cf9cae365a3ab0474c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b575a360d2ee43bc2629635a4ffda77a1b0e1a171bbd99b76b725784be537e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae8d985dec947c332efa41abbc5c30a978f06942cc62a86ab45e90fc2e5d763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aaa74c70ca8d93d672246f3451771bffa1e304c88a453cbbe3a305a304954fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0d624981ca616d27057de09511f8a17b643686871e2e77c15bd1c65fba9c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8a4afd85c0d75edd3184845b7cd816499325c7f7ecef0f46b961419dfe6dd86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8a4afd85c0d75edd3184845b7cd816499325c7f7ecef0f46b961419dfe6dd86\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 
14:43:53.739938 6460 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1003 14:43:53.739975 6460 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1003 14:43:53.739997 6460 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1003 14:43:53.740056 6460 factory.go:1336] Added *v1.Node event handler 7\\\\nI1003 14:43:53.740098 6460 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1003 14:43:53.740599 6460 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1003 14:43:53.740805 6460 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1003 14:43:53.740883 6460 ovnkube.go:599] Stopped ovnkube\\\\nI1003 14:43:53.740916 6460 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 14:43:53.741090 6460 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jzv75_openshift-ovn-kubernetes(01bef0c3-23b3-4f49-8c33-3f2ec7503b12)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac1b1bb4b30aa1d11558ff126ad87e4dd7b9aaf255cd4a1860aacdee30a0bb15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948
874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.597598 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ghf5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d3c89f-9fbd-4d50-840a-c5c78528c903\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ghf5t\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.664611 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.664650 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.664667 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.664692 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.664710 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:19Z","lastTransitionTime":"2025-10-03T14:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.740793 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzv75_01bef0c3-23b3-4f49-8c33-3f2ec7503b12/ovnkube-controller/2.log" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.743207 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" event={"ID":"01bef0c3-23b3-4f49-8c33-3f2ec7503b12","Type":"ContainerStarted","Data":"d5d7482646eb7f611642d40cc2246f4325e1fdc7d5b08eab0ac5777911a9b209"} Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.744271 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.762867 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019b
ee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25
01b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.767519 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.767556 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.767574 4774 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.767595 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.767612 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:19Z","lastTransitionTime":"2025-10-03T14:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.775625 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-03T14:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.784779 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\
"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.796541 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17e01ed9b7f955
272f2c5bb14a9624d7faeb0a4727698c093a2acc4e2da71af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:44:15Z\\\",\\\"message\\\":\\\"2025-10-03T14:43:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6f69d832-7344-4942-a14b-a5a9f788cb85\\\\n2025-10-03T14:43:29+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6f69d832-7344-4942-a14b-a5a9f788cb85 to /host/opt/cni/bin/\\\\n2025-10-03T14:43:30Z [verbose] multus-daemon started\\\\n2025-10-03T14:43:30Z [verbose] Readiness Indicator file check\\\\n2025-10-03T14:44:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.816154 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16e8c0249ee2448082160abe417f67a17fac176add76d00dd5e755c0f9a6a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.832764 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.847983 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5f4df4f-152d-4e0f-b040-8772ae05ccbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a653e60009bbcb51465ae267a42828fbf1e9863cc50c73ab2f3b490b5f1524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4623e559e31f88609bb4a406374a3ce997e19ca75d71bc291cb98ed209f1e2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e63db8eac87987c01cd81f4cc676a2bb7268ee091097ce8b1ed6a14bc3f2346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://272d3589475ea3f31e989b1c7326b814559a8c6cb5f9e5e042ffba28cb1b7c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://272d3589475ea3f31e989b1c7326b814559a8c6cb5f9e5e042ffba28cb1b7c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.866245 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a01f2ab-7f7c-411c-b424-0d382dee6976\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b65b48ad13df81dd9ccb04fd8aeebbf976409d5d318f51a923c2dfc0cd8cc7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3803c97bfaf2ffb04a8a6924cb97ee022c7d3
4a38e1a50ce63a7b8f062fc901d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jlwtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.869054 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.869086 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.869097 4774 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.869115 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.869127 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:19Z","lastTransitionTime":"2025-10-03T14:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.895562 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35
825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://686819b674279f59619905ac4ba451ca79c97fc652a773307eca3fad2352ddda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] 
\\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.915314 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd06fa447f9a388f7cabafaff05c8d6dc9aedc161d2097f81855318c8b7712e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b330d308f14c0b3b3630e89fe3e43728e17dbc3fe78cf9cae365a3ab0474c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b575a360d2ee43bc2629635a4ffda77a1b0e1a171bbd99b76b725784be537e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae8d985dec947c332efa41abbc5c30a978f06942cc62a86ab45e90fc2e5d763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aaa74c70ca8d93d672246f3451771bffa1e304c88a453cbbe3a305a304954fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0d624981ca616d27057de09511f8a17b643686871e2e77c15bd1c65fba9c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5d7482646eb7f611642d40cc2246f4325e1fdc7d5b08eab0ac5777911a9b209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8a4afd85c0d75edd3184845b7cd816499325c7f7ecef0f46b961419dfe6dd86\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 
14:43:53.739938 6460 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1003 14:43:53.739975 6460 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1003 14:43:53.739997 6460 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1003 14:43:53.740056 6460 factory.go:1336] Added *v1.Node event handler 7\\\\nI1003 14:43:53.740098 6460 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1003 14:43:53.740599 6460 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1003 14:43:53.740805 6460 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1003 14:43:53.740883 6460 ovnkube.go:599] Stopped ovnkube\\\\nI1003 14:43:53.740916 6460 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 14:43:53.741090 6460 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac1b1bb4b30aa1d11558ff126ad87e4dd7b9aaf255cd4a1860aacdee30a0bb15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.928026 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ghf5t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d3c89f-9fbd-4d50-840a-c5c78528c903\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ghf5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:19 crc 
kubenswrapper[4774]: I1003 14:44:19.938827 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.950761 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.969152 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.971672 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.971707 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.971717 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.971732 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.971742 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:19Z","lastTransitionTime":"2025-10-03T14:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.982275 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:19 crc kubenswrapper[4774]: I1003 14:44:19.993714 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.004182 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d94e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4
317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.015762 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.074118 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.074210 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.074226 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.074245 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.074257 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:20Z","lastTransitionTime":"2025-10-03T14:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.176257 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.176289 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.176298 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.176311 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.176320 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:20Z","lastTransitionTime":"2025-10-03T14:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.278814 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.278871 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.278888 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.278910 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.278953 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:20Z","lastTransitionTime":"2025-10-03T14:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.325788 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.325845 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.325788 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:44:20 crc kubenswrapper[4774]: E1003 14:44:20.325946 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:44:20 crc kubenswrapper[4774]: E1003 14:44:20.326065 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:44:20 crc kubenswrapper[4774]: E1003 14:44:20.326160 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.381747 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.381817 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.381835 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.381860 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.381878 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:20Z","lastTransitionTime":"2025-10-03T14:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.484657 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.484730 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.484750 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.484771 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.484954 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:20Z","lastTransitionTime":"2025-10-03T14:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.587188 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.587236 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.587248 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.587272 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.587289 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:20Z","lastTransitionTime":"2025-10-03T14:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.690065 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.690115 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.690129 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.690149 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.690161 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:20Z","lastTransitionTime":"2025-10-03T14:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.747774 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzv75_01bef0c3-23b3-4f49-8c33-3f2ec7503b12/ovnkube-controller/3.log" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.748443 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzv75_01bef0c3-23b3-4f49-8c33-3f2ec7503b12/ovnkube-controller/2.log" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.751031 4774 generic.go:334] "Generic (PLEG): container finished" podID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerID="d5d7482646eb7f611642d40cc2246f4325e1fdc7d5b08eab0ac5777911a9b209" exitCode=1 Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.751066 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" event={"ID":"01bef0c3-23b3-4f49-8c33-3f2ec7503b12","Type":"ContainerDied","Data":"d5d7482646eb7f611642d40cc2246f4325e1fdc7d5b08eab0ac5777911a9b209"} Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.751117 4774 scope.go:117] "RemoveContainer" containerID="d8a4afd85c0d75edd3184845b7cd816499325c7f7ecef0f46b961419dfe6dd86" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.751880 4774 scope.go:117] "RemoveContainer" containerID="d5d7482646eb7f611642d40cc2246f4325e1fdc7d5b08eab0ac5777911a9b209" Oct 03 14:44:20 crc kubenswrapper[4774]: E1003 14:44:20.752083 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jzv75_openshift-ovn-kubernetes(01bef0c3-23b3-4f49-8c33-3f2ec7503b12)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.777868 4774 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1
220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.792727 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.792782 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.792798 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.792823 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.792841 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:20Z","lastTransitionTime":"2025-10-03T14:44:20Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.794971 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\
\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://686819b674279f59619905ac4ba451ca79c97fc652a773307eca3fad2352ddda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c
7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 
14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.827437 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd06fa447f9a388f7cabafaff05c8d6dc9aedc161d2097f81855318c8b7712e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b330d308f14c0b3b3630e89fe3e43728e17dbc3fe78cf9cae365a3ab0474c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b575a360d2ee43bc2629635a4ffda77a1b0e1a171bbd99b76b725784be537e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae8d985dec947c332efa41abbc5c30a978f06942cc62a86ab45e90fc2e5d763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aaa74c70ca8d93d672246f3451771bffa1e304c88a453cbbe3a305a304954fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0d624981ca616d27057de09511f8a17b643686871e2e77c15bd1c65fba9c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5d7482646eb7f611642d40cc2246f4325e1fdc7d5b08eab0ac5777911a9b209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8a4afd85c0d75edd3184845b7cd816499325c7f7ecef0f46b961419dfe6dd86\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:43:53Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 
14:43:53.739938 6460 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1003 14:43:53.739975 6460 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1003 14:43:53.739997 6460 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1003 14:43:53.740056 6460 factory.go:1336] Added *v1.Node event handler 7\\\\nI1003 14:43:53.740098 6460 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1003 14:43:53.740599 6460 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1003 14:43:53.740805 6460 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1003 14:43:53.740883 6460 ovnkube.go:599] Stopped ovnkube\\\\nI1003 14:43:53.740916 6460 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 14:43:53.741090 6460 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5d7482646eb7f611642d40cc2246f4325e1fdc7d5b08eab0ac5777911a9b209\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:44:20Z\\\",\\\"message\\\":\\\"n.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator-webhook]} name:Service_openshift-machine-api/machine-api-operator-webhook_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.254:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{e4e4203e-87c7-4024-930a-5d6bdfe2bdde}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 14:44:20.331150 6849 ovnkube.go:599] Stopped ovnkube\\\\nI1003 14:44:20.331172 6849 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 14:44:20.331227 6849 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped 
a\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac1b1bb4b30aa1d11558ff126ad87e4dd7b9aaf255cd4a1860aacdee30a0bb15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.842591 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ghf5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d3c89f-9fbd-4d50-840a-c5c78528c903\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ghf5t\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.856652 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.875033 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.889626 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.895327 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.895400 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.895411 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:20 crc 
kubenswrapper[4774]: I1003 14:44:20.895424 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.895434 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:20Z","lastTransitionTime":"2025-10-03T14:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.901723 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.918199 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.930584 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d94e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4
317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.941884 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.955433 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5f4df4f-152d-4e0f-b040-8772ae05ccbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a653e60009bbcb51465ae267a42828fbf1e9863cc50c73ab2f3b490b5f1524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4623e559e31f88609bb4a406374a3ce997e19ca75d71bc291cb98ed209f1e2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e63db8eac87987c01cd81f4cc676a2bb7268ee091097ce8b1ed6a14bc3f2346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://272d3589475ea3f31e989b1c7326b814559a8c6cb5f9e5e042ffba28cb1b7c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://272d3589475ea3f31e989b1c7326b814559a8c6cb5f9e5e042ffba28cb1b7c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.975136 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.989456 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.997787 4774 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.997840 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.997860 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.997882 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:20 crc kubenswrapper[4774]: I1003 14:44:20.997899 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:20Z","lastTransitionTime":"2025-10-03T14:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.000241 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.017505 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17e01ed9b7f955272f2c5bb14a9624d7faeb0a4727
698c093a2acc4e2da71af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:44:15Z\\\",\\\"message\\\":\\\"2025-10-03T14:43:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6f69d832-7344-4942-a14b-a5a9f788cb85\\\\n2025-10-03T14:43:29+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6f69d832-7344-4942-a14b-a5a9f788cb85 to /host/opt/cni/bin/\\\\n2025-10-03T14:43:30Z [verbose] multus-daemon started\\\\n2025-10-03T14:43:30Z [verbose] Readiness Indicator file check\\\\n2025-10-03T14:44:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:21Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.042361 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16e8c0249ee2448082160abe417f67a17fac176add76d00dd5e755c0f9a6a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:21Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.059968 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a01f2ab-7f7c-411c-b424-0d382dee6976\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b65b48ad13df81dd9ccb04fd8aeebbf976409d5d318f51a923c2dfc0cd8cc7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3803c97bfaf2ffb04a8a6924cb97ee022c7d3
4a38e1a50ce63a7b8f062fc901d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jlwtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:21Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.101000 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.101055 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.101071 4774 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.101093 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.101112 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:21Z","lastTransitionTime":"2025-10-03T14:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.204088 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.204152 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.204167 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.204184 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.204196 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:21Z","lastTransitionTime":"2025-10-03T14:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.299227 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:44:21 crc kubenswrapper[4774]: E1003 14:44:21.299391 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.305847 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.305894 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.305908 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.305926 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.305939 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:21Z","lastTransitionTime":"2025-10-03T14:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.408246 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.408306 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.408330 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.408496 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.408511 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:21Z","lastTransitionTime":"2025-10-03T14:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.510885 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.510989 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.511003 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.511024 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.511040 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:21Z","lastTransitionTime":"2025-10-03T14:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.615738 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.615792 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.615804 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.615826 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.615839 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:21Z","lastTransitionTime":"2025-10-03T14:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.718691 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.718753 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.718772 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.718797 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.718816 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:21Z","lastTransitionTime":"2025-10-03T14:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.757064 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzv75_01bef0c3-23b3-4f49-8c33-3f2ec7503b12/ovnkube-controller/3.log" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.761129 4774 scope.go:117] "RemoveContainer" containerID="d5d7482646eb7f611642d40cc2246f4325e1fdc7d5b08eab0ac5777911a9b209" Oct 03 14:44:21 crc kubenswrapper[4774]: E1003 14:44:21.761529 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jzv75_openshift-ovn-kubernetes(01bef0c3-23b3-4f49-8c33-3f2ec7503b12)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.800205 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:21Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.820299 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:21Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.821244 4774 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.821270 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.821283 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.821299 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.821311 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:21Z","lastTransitionTime":"2025-10-03T14:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.831746 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:21Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.845728 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17e01ed9b7f955272f2c5bb14a9624d7faeb0a4727
698c093a2acc4e2da71af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:44:15Z\\\",\\\"message\\\":\\\"2025-10-03T14:43:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6f69d832-7344-4942-a14b-a5a9f788cb85\\\\n2025-10-03T14:43:29+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6f69d832-7344-4942-a14b-a5a9f788cb85 to /host/opt/cni/bin/\\\\n2025-10-03T14:43:30Z [verbose] multus-daemon started\\\\n2025-10-03T14:43:30Z [verbose] Readiness Indicator file check\\\\n2025-10-03T14:44:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:21Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.859228 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16e8c0249ee2448082160abe417f67a17fac176add76d00dd5e755c0f9a6a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:21Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.868788 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:21Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.878991 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5f4df4f-152d-4e0f-b040-8772ae05ccbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a653e60009bbcb51465ae267a42828fbf1e9863cc50c73ab2f3b490b5f1524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4623e559e31f88609bb4a406374a3ce997e19ca75d71bc291cb98ed209f1e2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e63db8eac87987c01cd81f4cc676a2bb7268ee091097ce8b1ed6a14bc3f2346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://272d3589475ea3f31e989b1c7326b814559a8c6cb5f9e5e042ffba28cb1b7c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://272d3589475ea3f31e989b1c7326b814559a8c6cb5f9e5e042ffba28cb1b7c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:21Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.887692 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a01f2ab-7f7c-411c-b424-0d382dee6976\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b65b48ad13df81dd9ccb04fd8aeebbf976409d5d318f51a923c2dfc0cd8cc7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3803c97bfaf2ffb04a8a6924cb97ee022c7d3
4a38e1a50ce63a7b8f062fc901d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jlwtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:21Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.898932 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://686819b674279f59619905ac4ba451ca79c97fc652a773307eca3fad2352ddda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6
a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:21Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.919070 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd06fa447f9a388f7cabafaff05c8d6dc9aedc161d2097f81855318c8b7712e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b330d308f14c0b3b3630e89fe3e43728e17dbc3fe78cf9cae365a3ab0474c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b575a360d2ee43bc2629635a4ffda77a1b0e1a171bbd99b76b725784be537e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae8d985dec947c332efa41abbc5c30a978f06942cc62a86ab45e90fc2e5d763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aaa74c70ca8d93d672246f3451771bffa1e304c88a453cbbe3a305a304954fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0d624981ca616d27057de09511f8a17b643686871e2e77c15bd1c65fba9c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5d7482646eb7f611642d40cc2246f4325e1fdc7d5b08eab0ac5777911a9b209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5d7482646eb7f611642d40cc2246f4325e1fdc7d5b08eab0ac5777911a9b209\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:44:20Z\\\",\\\"message\\\":\\\"n.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator-webhook]} 
name:Service_openshift-machine-api/machine-api-operator-webhook_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.254:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e4e4203e-87c7-4024-930a-5d6bdfe2bdde}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 14:44:20.331150 6849 ovnkube.go:599] Stopped ovnkube\\\\nI1003 14:44:20.331172 6849 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 14:44:20.331227 6849 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped a\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:44:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jzv75_openshift-ovn-kubernetes(01bef0c3-23b3-4f49-8c33-3f2ec7503b12)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac1b1bb4b30aa1d11558ff126ad87e4dd7b9aaf255cd4a1860aacdee30a0bb15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948
874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:21Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.923481 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.923523 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.923533 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.923546 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.923555 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:21Z","lastTransitionTime":"2025-10-03T14:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.930746 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ghf5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d3c89f-9fbd-4d50-840a-c5c78528c903\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ghf5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:21Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:21 crc 
kubenswrapper[4774]: I1003 14:44:21.944822 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:21Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.959482 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:21Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.972923 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:21Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.985599 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T14:44:21Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:21 crc kubenswrapper[4774]: I1003 14:44:21.999156 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:21Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.010227 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d94e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4
317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:22Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.021783 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:22Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.025489 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.025521 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.025532 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.025549 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.025561 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:22Z","lastTransitionTime":"2025-10-03T14:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.127718 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.127766 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.127782 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.127800 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.127813 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:22Z","lastTransitionTime":"2025-10-03T14:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.229704 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.229751 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.229760 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.229776 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.229785 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:22Z","lastTransitionTime":"2025-10-03T14:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.298592 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.298655 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.298652 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:44:22 crc kubenswrapper[4774]: E1003 14:44:22.298806 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:44:22 crc kubenswrapper[4774]: E1003 14:44:22.298897 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:44:22 crc kubenswrapper[4774]: E1003 14:44:22.299309 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.332568 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.332621 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.332632 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.332653 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.332664 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:22Z","lastTransitionTime":"2025-10-03T14:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.435313 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.435368 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.435402 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.435420 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.435430 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:22Z","lastTransitionTime":"2025-10-03T14:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.537835 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.537874 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.537882 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.537896 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.537908 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:22Z","lastTransitionTime":"2025-10-03T14:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.639945 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.639982 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.639994 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.640011 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.640023 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:22Z","lastTransitionTime":"2025-10-03T14:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.742478 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.742505 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.742514 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.742528 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.742538 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:22Z","lastTransitionTime":"2025-10-03T14:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.845344 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.845460 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.845478 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.845500 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.845517 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:22Z","lastTransitionTime":"2025-10-03T14:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.948167 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.948230 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.948248 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.948274 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:22 crc kubenswrapper[4774]: I1003 14:44:22.948296 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:22Z","lastTransitionTime":"2025-10-03T14:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.051414 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.051544 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.051561 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.051600 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.051613 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:23Z","lastTransitionTime":"2025-10-03T14:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.154101 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.154166 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.154201 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.154225 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.154244 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:23Z","lastTransitionTime":"2025-10-03T14:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.256015 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.256050 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.256060 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.256077 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.256089 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:23Z","lastTransitionTime":"2025-10-03T14:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.298769 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:44:23 crc kubenswrapper[4774]: E1003 14:44:23.298900 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.358911 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.358971 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.358986 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.359007 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.359021 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:23Z","lastTransitionTime":"2025-10-03T14:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.462061 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.462134 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.462153 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.462189 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.462208 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:23Z","lastTransitionTime":"2025-10-03T14:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.565799 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.565866 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.565886 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.565911 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.565925 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:23Z","lastTransitionTime":"2025-10-03T14:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.669385 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.669438 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.669448 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.669468 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.669485 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:23Z","lastTransitionTime":"2025-10-03T14:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.772033 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.772470 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.772631 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.772786 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.772932 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:23Z","lastTransitionTime":"2025-10-03T14:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.875903 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.875961 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.875977 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.876000 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.876017 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:23Z","lastTransitionTime":"2025-10-03T14:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.978441 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.978501 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.978521 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.978546 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:23 crc kubenswrapper[4774]: I1003 14:44:23.978564 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:23Z","lastTransitionTime":"2025-10-03T14:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.062971 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.063044 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.063068 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.063095 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.063117 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:24Z","lastTransitionTime":"2025-10-03T14:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:24 crc kubenswrapper[4774]: E1003 14:44:24.078951 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:24Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.083670 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.083744 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.083760 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.083780 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.083791 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:24Z","lastTransitionTime":"2025-10-03T14:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:24 crc kubenswrapper[4774]: E1003 14:44:24.100391 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:24Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.103775 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.103847 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.103860 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.103878 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.103888 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:24Z","lastTransitionTime":"2025-10-03T14:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:24 crc kubenswrapper[4774]: E1003 14:44:24.116074 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:24Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.120544 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.120582 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.120591 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.120606 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.120618 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:24Z","lastTransitionTime":"2025-10-03T14:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:24 crc kubenswrapper[4774]: E1003 14:44:24.134518 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:24Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.138650 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.138692 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.138707 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.138724 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.138761 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:24Z","lastTransitionTime":"2025-10-03T14:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:24 crc kubenswrapper[4774]: E1003 14:44:24.150515 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:24Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:24 crc kubenswrapper[4774]: E1003 14:44:24.150783 4774 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.152630 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.152695 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.152721 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.152751 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.152772 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:24Z","lastTransitionTime":"2025-10-03T14:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.256239 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.256280 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.256291 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.256308 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.256319 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:24Z","lastTransitionTime":"2025-10-03T14:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.298908 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.298959 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:44:24 crc kubenswrapper[4774]: E1003 14:44:24.299023 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.298916 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:44:24 crc kubenswrapper[4774]: E1003 14:44:24.299187 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:44:24 crc kubenswrapper[4774]: E1003 14:44:24.299297 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.359694 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.359765 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.359784 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.359811 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.359829 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:24Z","lastTransitionTime":"2025-10-03T14:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.462983 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.463279 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.463404 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.463487 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.463557 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:24Z","lastTransitionTime":"2025-10-03T14:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.566234 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.566281 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.566293 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.566310 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.566322 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:24Z","lastTransitionTime":"2025-10-03T14:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.670082 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.670151 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.670162 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.670179 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.670192 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:24Z","lastTransitionTime":"2025-10-03T14:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.773701 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.773756 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.773771 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.773790 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.773809 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:24Z","lastTransitionTime":"2025-10-03T14:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.876174 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.876209 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.876218 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.876232 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.876240 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:24Z","lastTransitionTime":"2025-10-03T14:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.979028 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.979076 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.979087 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.979104 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:24 crc kubenswrapper[4774]: I1003 14:44:24.979117 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:24Z","lastTransitionTime":"2025-10-03T14:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.082593 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.082659 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.082677 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.082702 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.082719 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:25Z","lastTransitionTime":"2025-10-03T14:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.186129 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.186196 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.186217 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.186243 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.186259 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:25Z","lastTransitionTime":"2025-10-03T14:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.288759 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.288817 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.288831 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.288850 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.288864 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:25Z","lastTransitionTime":"2025-10-03T14:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.298565 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:44:25 crc kubenswrapper[4774]: E1003 14:44:25.298704 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.396490 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.396548 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.396558 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.396573 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.396584 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:25Z","lastTransitionTime":"2025-10-03T14:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.498920 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.498983 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.498998 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.499016 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.499028 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:25Z","lastTransitionTime":"2025-10-03T14:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.602142 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.602227 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.602245 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.602269 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.602286 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:25Z","lastTransitionTime":"2025-10-03T14:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.705662 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.705702 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.705713 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.705760 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.705772 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:25Z","lastTransitionTime":"2025-10-03T14:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.808170 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.808209 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.808217 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.808233 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.808243 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:25Z","lastTransitionTime":"2025-10-03T14:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.911199 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.911232 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.911241 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.911254 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:25 crc kubenswrapper[4774]: I1003 14:44:25.911263 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:25Z","lastTransitionTime":"2025-10-03T14:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.002192 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:44:26 crc kubenswrapper[4774]: E1003 14:44:26.002418 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-03 14:45:30.002353376 +0000 UTC m=+152.591556868 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.014731 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.014777 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.014788 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.014804 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.014815 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:26Z","lastTransitionTime":"2025-10-03T14:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.103525 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.103589 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.103618 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.103647 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:44:26 crc kubenswrapper[4774]: E1003 14:44:26.103702 4774 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Oct 03 14:44:26 crc kubenswrapper[4774]: E1003 14:44:26.103834 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 14:44:26 crc kubenswrapper[4774]: E1003 14:44:26.103863 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 14:44:26 crc kubenswrapper[4774]: E1003 14:44:26.103859 4774 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 14:44:26 crc kubenswrapper[4774]: E1003 14:44:26.103877 4774 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:44:26 crc kubenswrapper[4774]: E1003 14:44:26.103971 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 14:45:30.103893881 +0000 UTC m=+152.693097373 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 14:44:26 crc kubenswrapper[4774]: E1003 14:44:26.104017 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 14:45:30.103996234 +0000 UTC m=+152.693199886 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 14:44:26 crc kubenswrapper[4774]: E1003 14:44:26.104049 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 14:45:30.104032925 +0000 UTC m=+152.693236427 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:44:26 crc kubenswrapper[4774]: E1003 14:44:26.104071 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 14:44:26 crc kubenswrapper[4774]: E1003 14:44:26.104118 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 14:44:26 crc kubenswrapper[4774]: E1003 14:44:26.104148 4774 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:44:26 crc kubenswrapper[4774]: E1003 14:44:26.104237 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 14:45:30.10421232 +0000 UTC m=+152.693415802 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.117329 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.117383 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.117393 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.117408 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.117421 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:26Z","lastTransitionTime":"2025-10-03T14:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.220003 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.220078 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.220101 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.220125 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.220143 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:26Z","lastTransitionTime":"2025-10-03T14:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.299123 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.299227 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.299228 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:44:26 crc kubenswrapper[4774]: E1003 14:44:26.299364 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:44:26 crc kubenswrapper[4774]: E1003 14:44:26.299460 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:44:26 crc kubenswrapper[4774]: E1003 14:44:26.299520 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.322228 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.322263 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.322272 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.322285 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.322299 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:26Z","lastTransitionTime":"2025-10-03T14:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.425183 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.425262 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.425324 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.425354 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.425414 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:26Z","lastTransitionTime":"2025-10-03T14:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.528844 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.528901 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.528912 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.528934 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.528947 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:26Z","lastTransitionTime":"2025-10-03T14:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.631987 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.632058 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.632077 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.632105 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.632124 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:26Z","lastTransitionTime":"2025-10-03T14:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.734849 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.734926 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.734949 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.734978 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.734999 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:26Z","lastTransitionTime":"2025-10-03T14:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.838686 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.838754 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.838790 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.838814 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.838834 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:26Z","lastTransitionTime":"2025-10-03T14:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.943043 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.943122 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.943135 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.943157 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:26 crc kubenswrapper[4774]: I1003 14:44:26.943170 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:26Z","lastTransitionTime":"2025-10-03T14:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.047211 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.047248 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.047259 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.047275 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.047285 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:27Z","lastTransitionTime":"2025-10-03T14:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.149745 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.149780 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.149790 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.149804 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.149816 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:27Z","lastTransitionTime":"2025-10-03T14:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.252608 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.252639 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.252655 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.252669 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.252677 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:27Z","lastTransitionTime":"2025-10-03T14:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.298613 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:44:27 crc kubenswrapper[4774]: E1003 14:44:27.298789 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.355496 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.355556 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.355573 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.355596 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.355615 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:27Z","lastTransitionTime":"2025-10-03T14:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.457706 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.457754 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.457771 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.457790 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.457803 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:27Z","lastTransitionTime":"2025-10-03T14:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.561490 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.561794 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.561807 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.561826 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.561838 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:27Z","lastTransitionTime":"2025-10-03T14:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.664864 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.664914 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.664930 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.664954 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.664973 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:27Z","lastTransitionTime":"2025-10-03T14:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.768219 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.768283 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.768301 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.768323 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.768339 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:27Z","lastTransitionTime":"2025-10-03T14:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.871643 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.871713 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.871753 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.871780 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.871803 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:27Z","lastTransitionTime":"2025-10-03T14:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.981264 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.981324 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.981334 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.981397 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:27 crc kubenswrapper[4774]: I1003 14:44:27.981409 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:27Z","lastTransitionTime":"2025-10-03T14:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.084994 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.085060 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.085082 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.085108 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.085127 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:28Z","lastTransitionTime":"2025-10-03T14:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.187523 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.187562 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.187571 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.187586 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.187594 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:28Z","lastTransitionTime":"2025-10-03T14:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.290003 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.290053 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.290070 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.290093 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.290110 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:28Z","lastTransitionTime":"2025-10-03T14:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.299351 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.299463 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:44:28 crc kubenswrapper[4774]: E1003 14:44:28.299522 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.299463 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:44:28 crc kubenswrapper[4774]: E1003 14:44:28.299648 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:44:28 crc kubenswrapper[4774]: E1003 14:44:28.299794 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.392337 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.392427 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.392446 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.392467 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.392480 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:28Z","lastTransitionTime":"2025-10-03T14:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.495178 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.495208 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.495219 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.495235 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.495248 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:28Z","lastTransitionTime":"2025-10-03T14:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.597788 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.597854 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.597868 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.597905 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.597918 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:28Z","lastTransitionTime":"2025-10-03T14:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.700994 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.701059 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.701072 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.701089 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.701101 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:28Z","lastTransitionTime":"2025-10-03T14:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.803528 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.803587 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.803605 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.803629 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.803645 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:28Z","lastTransitionTime":"2025-10-03T14:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.906301 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.906362 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.906407 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.906433 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:28 crc kubenswrapper[4774]: I1003 14:44:28.906451 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:28Z","lastTransitionTime":"2025-10-03T14:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.009042 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.009114 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.009132 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.009157 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.009177 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:29Z","lastTransitionTime":"2025-10-03T14:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.113893 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.113969 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.113987 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.114027 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.114045 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:29Z","lastTransitionTime":"2025-10-03T14:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.216702 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.216770 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.216790 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.216813 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.216830 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:29Z","lastTransitionTime":"2025-10-03T14:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.298711 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:44:29 crc kubenswrapper[4774]: E1003 14:44:29.298940 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.310164 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.314015 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17e01ed9b7f955272f2c5bb14a9624d7faeb0a4727698c093a2acc4e2da71af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\
"2025-10-03T14:44:15Z\\\",\\\"message\\\":\\\"2025-10-03T14:43:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6f69d832-7344-4942-a14b-a5a9f788cb85\\\\n2025-10-03T14:43:29+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6f69d832-7344-4942-a14b-a5a9f788cb85 to /host/opt/cni/bin/\\\\n2025-10-03T14:43:30Z [verbose] multus-daemon started\\\\n2025-10-03T14:43:30Z [verbose] Readiness Indicator file check\\\\n2025-10-03T14:44:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\
\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.319982 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.320025 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.320044 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.320066 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.320084 4774 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:29Z","lastTransitionTime":"2025-10-03T14:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.338627 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16e8c0249ee2448082160abe417f67a17fac176add76d00dd5e755c0f9a6a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"
2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07
f159a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:33Z\\
\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.355851 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.373604 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5f4df4f-152d-4e0f-b040-8772ae05ccbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a653e60009bbcb51465ae267a42828fbf1e9863cc50c73ab2f3b490b5f1524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4623e559e31f88609bb4a406374a3ce997e19ca75d71bc291cb98ed209f1e2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e63db8eac87987c01cd81f4cc676a2bb7268ee091097ce8b1ed6a14bc3f2346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://272d3589475ea3f31e989b1c7326b814559a8c6cb5f9e5e042ffba28cb1b7c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://272d3589475ea3f31e989b1c7326b814559a8c6cb5f9e5e042ffba28cb1b7c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.394004 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.408834 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.419678 4774 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.423645 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.423681 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.423693 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.423710 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.423722 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:29Z","lastTransitionTime":"2025-10-03T14:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.431275 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a01f2ab-7f7c-411c-b424-0d382dee6976\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b65b48ad13df81dd9ccb04fd8aeebbf976409d5d318f51a923c2dfc0cd8cc7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3803c97bfaf2ffb04a8a6924cb97ee022c7d34a38e1a50ce63a7b8f062fc901d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jlwtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.442981 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56108f8d576068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.456961 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://686819b674279f59619905ac4ba451ca79c97fc652a773307eca3fad2352ddda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6
a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.472186 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd06fa447f9a388f7cabafaff05c8d6dc9aedc161d2097f81855318c8b7712e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b330d308f14c0b3b3630e89fe3e43728e17dbc3fe78cf9cae365a3ab0474c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b575a360d2ee43bc2629635a4ffda77a1b0e1a171bbd99b76b725784be537e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae8d985dec947c332efa41abbc5c30a978f06942cc62a86ab45e90fc2e5d763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aaa74c70ca8d93d672246f3451771bffa1e304c88a453cbbe3a305a304954fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0d624981ca616d27057de09511f8a17b643686871e2e77c15bd1c65fba9c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5d7482646eb7f611642d40cc2246f4325e1fdc7d5b08eab0ac5777911a9b209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5d7482646eb7f611642d40cc2246f4325e1fdc7d5b08eab0ac5777911a9b209\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:44:20Z\\\",\\\"message\\\":\\\"n.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator-webhook]} 
name:Service_openshift-machine-api/machine-api-operator-webhook_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.254:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e4e4203e-87c7-4024-930a-5d6bdfe2bdde}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 14:44:20.331150 6849 ovnkube.go:599] Stopped ovnkube\\\\nI1003 14:44:20.331172 6849 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 14:44:20.331227 6849 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped a\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:44:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jzv75_openshift-ovn-kubernetes(01bef0c3-23b3-4f49-8c33-3f2ec7503b12)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac1b1bb4b30aa1d11558ff126ad87e4dd7b9aaf255cd4a1860aacdee30a0bb15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948
874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.481503 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ghf5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d3c89f-9fbd-4d50-840a-c5c78528c903\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ghf5t\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.494040 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d94e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:29 crc 
kubenswrapper[4774]: I1003 14:44:29.507774 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.524697 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.525532 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.525599 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.525622 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.525647 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.525663 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:29Z","lastTransitionTime":"2025-10-03T14:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.539327 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.553345 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T14:44:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.567539 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.628947 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.629083 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.629113 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.629142 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.629165 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:29Z","lastTransitionTime":"2025-10-03T14:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.731625 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.731715 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.731732 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.731753 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.731769 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:29Z","lastTransitionTime":"2025-10-03T14:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.834437 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.834489 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.834500 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.834515 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.834524 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:29Z","lastTransitionTime":"2025-10-03T14:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.937244 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.937290 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.937303 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.937323 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:29 crc kubenswrapper[4774]: I1003 14:44:29.937338 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:29Z","lastTransitionTime":"2025-10-03T14:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.040276 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.040346 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.040400 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.040433 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.040457 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:30Z","lastTransitionTime":"2025-10-03T14:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.144299 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.144472 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.144509 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.144543 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.144567 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:30Z","lastTransitionTime":"2025-10-03T14:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.247353 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.247452 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.247474 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.247498 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.247517 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:30Z","lastTransitionTime":"2025-10-03T14:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.299077 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.299077 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.299226 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:44:30 crc kubenswrapper[4774]: E1003 14:44:30.299447 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:44:30 crc kubenswrapper[4774]: E1003 14:44:30.299579 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:44:30 crc kubenswrapper[4774]: E1003 14:44:30.299665 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.350712 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.350761 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.350773 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.350792 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.350807 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:30Z","lastTransitionTime":"2025-10-03T14:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.454648 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.454710 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.454729 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.454754 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.454775 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:30Z","lastTransitionTime":"2025-10-03T14:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.557220 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.557259 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.557268 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.557282 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.557290 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:30Z","lastTransitionTime":"2025-10-03T14:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.659695 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.659741 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.659752 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.659770 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.659781 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:30Z","lastTransitionTime":"2025-10-03T14:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.762336 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.762369 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.762392 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.762405 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.762414 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:30Z","lastTransitionTime":"2025-10-03T14:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.863940 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.863985 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.863997 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.864010 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.864022 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:30Z","lastTransitionTime":"2025-10-03T14:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.966695 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.966740 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.966751 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.966770 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:30 crc kubenswrapper[4774]: I1003 14:44:30.966782 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:30Z","lastTransitionTime":"2025-10-03T14:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.069824 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.069890 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.069909 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.069933 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.069952 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:31Z","lastTransitionTime":"2025-10-03T14:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.173313 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.173419 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.173443 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.173469 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.173491 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:31Z","lastTransitionTime":"2025-10-03T14:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.276033 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.276112 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.276137 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.276168 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.276190 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:31Z","lastTransitionTime":"2025-10-03T14:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.298814 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:44:31 crc kubenswrapper[4774]: E1003 14:44:31.299024 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.378346 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.378447 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.378464 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.378488 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.378505 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:31Z","lastTransitionTime":"2025-10-03T14:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.480665 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.480698 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.480709 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.480725 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.480738 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:31Z","lastTransitionTime":"2025-10-03T14:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.583771 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.583837 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.583859 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.583886 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.583906 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:31Z","lastTransitionTime":"2025-10-03T14:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.686335 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.686368 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.686396 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.686413 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.686426 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:31Z","lastTransitionTime":"2025-10-03T14:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.788001 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.788043 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.788051 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.788065 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.788075 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:31Z","lastTransitionTime":"2025-10-03T14:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.890511 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.890548 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.890559 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.890575 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.890585 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:31Z","lastTransitionTime":"2025-10-03T14:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.992970 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.993009 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.993024 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.993044 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:31 crc kubenswrapper[4774]: I1003 14:44:31.993058 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:31Z","lastTransitionTime":"2025-10-03T14:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.094869 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.094913 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.094929 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.094964 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.094984 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:32Z","lastTransitionTime":"2025-10-03T14:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.197813 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.197876 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.197894 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.197919 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.197935 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:32Z","lastTransitionTime":"2025-10-03T14:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.299139 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.299174 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:44:32 crc kubenswrapper[4774]: E1003 14:44:32.299307 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.299343 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:44:32 crc kubenswrapper[4774]: E1003 14:44:32.299524 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:44:32 crc kubenswrapper[4774]: E1003 14:44:32.299720 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.300852 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.300884 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.300896 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.300909 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.300921 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:32Z","lastTransitionTime":"2025-10-03T14:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.404124 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.404214 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.404250 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.404282 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.404308 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:32Z","lastTransitionTime":"2025-10-03T14:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.506569 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.506641 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.506665 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.506701 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.506727 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:32Z","lastTransitionTime":"2025-10-03T14:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.623436 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.623464 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.623472 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.623484 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.623492 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:32Z","lastTransitionTime":"2025-10-03T14:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.725854 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.725903 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.725914 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.725933 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.725945 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:32Z","lastTransitionTime":"2025-10-03T14:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.828629 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.828672 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.828683 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.828698 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.828707 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:32Z","lastTransitionTime":"2025-10-03T14:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.931425 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.931455 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.931463 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.931475 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:32 crc kubenswrapper[4774]: I1003 14:44:32.931484 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:32Z","lastTransitionTime":"2025-10-03T14:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.034099 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.034143 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.034154 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.034169 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.034180 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:33Z","lastTransitionTime":"2025-10-03T14:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.138640 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.138721 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.138761 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.138792 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.138815 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:33Z","lastTransitionTime":"2025-10-03T14:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.242011 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.242073 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.242089 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.242112 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.242128 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:33Z","lastTransitionTime":"2025-10-03T14:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.299111 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:44:33 crc kubenswrapper[4774]: E1003 14:44:33.299339 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.344743 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.344805 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.344825 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.344850 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.344867 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:33Z","lastTransitionTime":"2025-10-03T14:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.448055 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.448103 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.448119 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.448140 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.448157 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:33Z","lastTransitionTime":"2025-10-03T14:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.550872 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.550910 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.550923 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.550940 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.550953 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:33Z","lastTransitionTime":"2025-10-03T14:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.654008 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.654043 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.654053 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.654067 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.654078 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:33Z","lastTransitionTime":"2025-10-03T14:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.757463 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.757499 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.757507 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.757524 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.757537 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:33Z","lastTransitionTime":"2025-10-03T14:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.860304 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.860349 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.860362 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.860409 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.860615 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:33Z","lastTransitionTime":"2025-10-03T14:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.963187 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.963261 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.963285 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.963315 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:33 crc kubenswrapper[4774]: I1003 14:44:33.963339 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:33Z","lastTransitionTime":"2025-10-03T14:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.065961 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.066010 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.066028 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.066048 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.066063 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:34Z","lastTransitionTime":"2025-10-03T14:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.156539 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.156596 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.156621 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.156643 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.156658 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:34Z","lastTransitionTime":"2025-10-03T14:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:34 crc kubenswrapper[4774]: E1003 14:44:34.170628 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:34Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.175283 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.175338 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.175348 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.175364 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.175392 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:34Z","lastTransitionTime":"2025-10-03T14:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:34 crc kubenswrapper[4774]: E1003 14:44:34.190007 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:34Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.192873 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.192927 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.192937 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.192950 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.192959 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:34Z","lastTransitionTime":"2025-10-03T14:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:34 crc kubenswrapper[4774]: E1003 14:44:34.207424 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:34Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.211526 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.211563 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.211576 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.211593 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.211605 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:34Z","lastTransitionTime":"2025-10-03T14:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:34 crc kubenswrapper[4774]: E1003 14:44:34.227075 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:34Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.230934 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.230972 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.230982 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.230996 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.231005 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:34Z","lastTransitionTime":"2025-10-03T14:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:34 crc kubenswrapper[4774]: E1003 14:44:34.247759 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1c19104c-b8fa-49cf-91e0-46a9a8f59ee9\\\",\\\"systemUUID\\\":\\\"58c6b0c8-e5e6-4c5b-94e2-7ea0f5ff6464\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:34Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:34 crc kubenswrapper[4774]: E1003 14:44:34.247903 4774 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.249303 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.249363 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.249409 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.249435 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.249452 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:34Z","lastTransitionTime":"2025-10-03T14:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.299292 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.299385 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.299520 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:44:34 crc kubenswrapper[4774]: E1003 14:44:34.299605 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:44:34 crc kubenswrapper[4774]: E1003 14:44:34.299725 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:44:34 crc kubenswrapper[4774]: E1003 14:44:34.299767 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.352687 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.352753 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.352771 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.352798 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.352857 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:34Z","lastTransitionTime":"2025-10-03T14:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.456016 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.456059 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.456069 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.456083 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.456093 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:34Z","lastTransitionTime":"2025-10-03T14:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.559060 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.559115 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.559132 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.559154 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.559171 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:34Z","lastTransitionTime":"2025-10-03T14:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.661717 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.661767 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.661780 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.661798 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.661810 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:34Z","lastTransitionTime":"2025-10-03T14:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.764981 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.765320 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.765339 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.765361 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.765410 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:34Z","lastTransitionTime":"2025-10-03T14:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.867430 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.867465 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.867476 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.867489 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.867498 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:34Z","lastTransitionTime":"2025-10-03T14:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.970218 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.970256 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.970265 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.970278 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:34 crc kubenswrapper[4774]: I1003 14:44:34.970286 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:34Z","lastTransitionTime":"2025-10-03T14:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.076469 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.076547 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.076561 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.076585 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.076600 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:35Z","lastTransitionTime":"2025-10-03T14:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.179612 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.179660 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.179673 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.179691 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.179703 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:35Z","lastTransitionTime":"2025-10-03T14:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.282680 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.282757 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.282781 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.282811 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.282836 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:35Z","lastTransitionTime":"2025-10-03T14:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.298769 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:44:35 crc kubenswrapper[4774]: E1003 14:44:35.299229 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.299626 4774 scope.go:117] "RemoveContainer" containerID="d5d7482646eb7f611642d40cc2246f4325e1fdc7d5b08eab0ac5777911a9b209" Oct 03 14:44:35 crc kubenswrapper[4774]: E1003 14:44:35.299819 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jzv75_openshift-ovn-kubernetes(01bef0c3-23b3-4f49-8c33-3f2ec7503b12)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.386460 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.386534 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.386558 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.386582 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.386600 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:35Z","lastTransitionTime":"2025-10-03T14:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.489498 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.489558 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.489573 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.489592 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.489604 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:35Z","lastTransitionTime":"2025-10-03T14:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.593222 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.593312 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.593352 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.593435 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.593462 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:35Z","lastTransitionTime":"2025-10-03T14:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.697038 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.697114 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.697138 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.697170 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.697193 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:35Z","lastTransitionTime":"2025-10-03T14:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.799455 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.799497 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.799508 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.799524 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.799536 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:35Z","lastTransitionTime":"2025-10-03T14:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.902034 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.902114 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.902139 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.902175 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:35 crc kubenswrapper[4774]: I1003 14:44:35.902199 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:35Z","lastTransitionTime":"2025-10-03T14:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.005076 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.005136 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.005148 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.005166 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.005177 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:36Z","lastTransitionTime":"2025-10-03T14:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.108830 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.108899 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.109225 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.109262 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.109284 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:36Z","lastTransitionTime":"2025-10-03T14:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.212060 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.212131 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.212150 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.212175 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.212193 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:36Z","lastTransitionTime":"2025-10-03T14:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.299356 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.299356 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:44:36 crc kubenswrapper[4774]: E1003 14:44:36.299642 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:44:36 crc kubenswrapper[4774]: E1003 14:44:36.299676 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.299392 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:44:36 crc kubenswrapper[4774]: E1003 14:44:36.299895 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.315192 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.315268 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.315291 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.315319 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.315340 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:36Z","lastTransitionTime":"2025-10-03T14:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.417978 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.418030 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.418047 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.418068 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.418087 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:36Z","lastTransitionTime":"2025-10-03T14:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.521840 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.521899 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.521918 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.521943 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.521964 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:36Z","lastTransitionTime":"2025-10-03T14:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.624064 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.624107 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.624118 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.624132 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.624143 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:36Z","lastTransitionTime":"2025-10-03T14:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.726818 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.726885 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.726907 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.726926 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.726942 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:36Z","lastTransitionTime":"2025-10-03T14:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.829678 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.829716 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.829727 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.829741 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.829752 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:36Z","lastTransitionTime":"2025-10-03T14:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.932060 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.932103 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.932117 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.932133 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:36 crc kubenswrapper[4774]: I1003 14:44:36.932148 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:36Z","lastTransitionTime":"2025-10-03T14:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.035190 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.035245 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.035262 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.035289 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.035308 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:37Z","lastTransitionTime":"2025-10-03T14:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.137685 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.137718 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.137727 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.137741 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.137749 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:37Z","lastTransitionTime":"2025-10-03T14:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.240510 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.240572 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.240590 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.240612 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.240628 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:37Z","lastTransitionTime":"2025-10-03T14:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.298711 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:44:37 crc kubenswrapper[4774]: E1003 14:44:37.298835 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.343569 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.343612 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.343623 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.343640 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.343652 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:37Z","lastTransitionTime":"2025-10-03T14:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.446746 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.446782 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.446792 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.446832 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.446863 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:37Z","lastTransitionTime":"2025-10-03T14:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.552263 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.552325 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.552352 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.552392 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.552406 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:37Z","lastTransitionTime":"2025-10-03T14:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.654679 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.654717 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.654728 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.654743 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.654752 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:37Z","lastTransitionTime":"2025-10-03T14:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.756811 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.756872 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.756888 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.756910 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.756925 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:37Z","lastTransitionTime":"2025-10-03T14:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.859603 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.859695 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.859720 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.859755 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.859779 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:37Z","lastTransitionTime":"2025-10-03T14:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.963499 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.963571 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.963585 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.963606 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:37 crc kubenswrapper[4774]: I1003 14:44:37.963643 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:37Z","lastTransitionTime":"2025-10-03T14:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.065898 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.065963 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.065974 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.065990 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.066002 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:38Z","lastTransitionTime":"2025-10-03T14:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.168171 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.168206 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.168216 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.168233 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.168249 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:38Z","lastTransitionTime":"2025-10-03T14:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.271437 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.271491 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.271503 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.271519 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.271530 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:38Z","lastTransitionTime":"2025-10-03T14:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.298938 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.298938 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.298968 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:44:38 crc kubenswrapper[4774]: E1003 14:44:38.299121 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:44:38 crc kubenswrapper[4774]: E1003 14:44:38.299279 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:44:38 crc kubenswrapper[4774]: E1003 14:44:38.299508 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.374499 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.374539 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.374549 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.374564 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.374574 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:38Z","lastTransitionTime":"2025-10-03T14:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.478057 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.478106 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.478116 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.478130 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.478139 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:38Z","lastTransitionTime":"2025-10-03T14:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.581797 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.581884 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.581920 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.581949 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.581971 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:38Z","lastTransitionTime":"2025-10-03T14:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.684957 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.685019 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.685036 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.685059 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.685076 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:38Z","lastTransitionTime":"2025-10-03T14:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.788093 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.788156 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.788174 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.788197 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.788213 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:38Z","lastTransitionTime":"2025-10-03T14:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.891606 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.891662 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.891676 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.891697 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.891709 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:38Z","lastTransitionTime":"2025-10-03T14:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.995029 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.995130 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.995148 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.995171 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:38 crc kubenswrapper[4774]: I1003 14:44:38.995192 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:38Z","lastTransitionTime":"2025-10-03T14:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.097774 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.097848 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.097871 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.097898 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.097917 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:39Z","lastTransitionTime":"2025-10-03T14:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.200656 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.200726 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.200751 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.200783 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.200806 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:39Z","lastTransitionTime":"2025-10-03T14:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.298821 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:44:39 crc kubenswrapper[4774]: E1003 14:44:39.298996 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.303199 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.303234 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.303244 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.303285 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.303296 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:39Z","lastTransitionTime":"2025-10-03T14:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.312029 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://806e8200bbe77fe56e132a95d6dac697f4d58c06753197dfc8c8a5e1dc352e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:39Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.326897 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:39Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.340854 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:39Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.353535 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdf3c844c8140577f4a12b0bbeaace27dd459ae2fc2dad16383bafe74239ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T14:44:39Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.365336 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:39Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.375798 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca37ac4b-f421-4198-a179-12901d36f0f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ea15ebb8d160352352701d64bc6a27907327b9d94e4003cade95d3c48ff923a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4
317858f0c2cb356d57d5c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpr59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6v5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:39Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.386470 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rsftk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"710a59a2-4b99-43b8-89b7-a7a1c8723d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f1fde10fae4f9ed070b3b5cade7035a11dddbeb7a284e21ec5b82c8da1fd986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbvhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rsftk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:39Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.399128 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5f4df4f-152d-4e0f-b040-8772ae05ccbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a653e60009bbcb51465ae267a42828fbf1e9863cc50c73ab2f3b490b5f1524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4623e559e31f88609bb4a406374a3ce997e19ca75d71bc291cb98ed209f1e2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e63db8eac87987c01cd81f4cc676a2bb7268ee091097ce8b1ed6a14bc3f2346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://272d3589475ea3f31e989b1c7326b814559a8c6cb5f9e5e042ffba28cb1b7c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://272d3589475ea3f31e989b1c7326b814559a8c6cb5f9e5e042ffba28cb1b7c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:39Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.405759 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.405829 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.405845 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 
03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.405865 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.405879 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:39Z","lastTransitionTime":"2025-10-03T14:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.410646 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c704e3ab-f243-4860-9a68-3f6a26e477a5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c584f9a25d7b0b01f007c91f9a0ccca0af44f8a71f933e56cbdcaf38e316481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08
aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79d847934b0a91b6911f58bcf9853e8fe8caeb4c15c07a331a812aa662da7bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79d847934b0a91b6911f58bcf9853e8fe8caeb4c15c07a331a812aa662da7bd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-03T14:44:39Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.431789 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628a23cf-753a-4f4a-b455-658467615902\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702be5dfbabf3fffc4e742c9fcf6cefa52890d880d7a70b4a51e06261e713d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c4e22cfdff2133efa2d801c72705731661045b64a5f61f98f0e9ff016b009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f2881bed26846951f0e0719b394fd6346f506ecac05be5b56f2067c161c251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79f31c42938d9aeab464ab18ad43c9f77d25c16c28772f1d8ca19720d48fe687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f
8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10c95debd65843178955e876ab835c8a8e4d9e3eb7633cf27a19431e96856bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38e166c72f70deb2d3d63d645e444b2ade15e92aa1cad4a36404bbff5f7e733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40ade5af5898cd557866fa723fa9554ff63f4bff4c55ed56e206da5c06091fcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2501b26f7159ba46ac1f41e7e420ffb807ac75a6b504b271b51d8c0f4560c319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-
03T14:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:39Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.445282 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86993f90c95e5e1ad34e53c99e2275e7d659848ec667b5845a0511f239d6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64aaea8922d4488a9d22d9eb1fa8750f87973c2b3aac676068414efe1afd64f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:39Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.457144 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wspzq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b8a9763-d221-4434-8349-cf961e825cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98a52de272510f2b01f7b293d31377cd029e569f6d4746d831976fab06331c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54bhg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wspzq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:39Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.470263 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jk5hb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:44:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a17e01ed9b7f955272f2c5bb14a9624d7faeb0a4727698c093a2acc4e2da71af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:44:15Z\\\",\\\"message\\\":\\\"2025-10-03T14:43:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6f69d832-7344-4942-a14b-a5a9f788cb85\\\\n2025-10-03T14:43:29+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6f69d832-7344-4942-a14b-a5a9f788cb85 to /host/opt/cni/bin/\\\\n2025-10-03T14:43:30Z [verbose] multus-daemon started\\\\n2025-10-03T14:43:30Z [verbose] Readiness Indicator file check\\\\n2025-10-03T14:44:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"ho
st-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6dfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jk5hb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:39Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.483923 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd01f0ff-e4fd-4c0a-8ec1-c0e19eabad6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16e8c0249ee2448082160abe417f67a17fac176add76d00dd5e755c0f9a6a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba60850a0306e0d30d6863cd44fe9af8820c7e098ddf2378529aecd2971532c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6963749b04afdd2e503d16be0f9d529aa37f19df58dc63dde43eb4d0f54a3b16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c829fe60e5487b2bb4100b38d261b9151c1cabdd163d34c11dea216648491d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc0
c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61bc0c988cb0d8ed5f8088a670c63f5ded45ab44fcff0ee5d91a8b3070c9affc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a47ff23626cf58139d68eb75784148d5be055680d2122ab190438fdc07f159a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:33Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf63a29cf197a8262e45ad4028653ef24eef2f38287482aa0a39b9cb8b23ba1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfn4b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5bs7x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:39Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.496664 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a01f2ab-7f7c-411c-b424-0d382dee6976\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b65b48ad13df81dd9ccb04fd8aeebbf976409d5d318f51a923c2dfc0cd8cc7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3803c97bfaf2ffb04a8a6924cb97ee022c7d34a38e1a50ce63a7b8f062fc901d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqxln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jlwtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-03T14:44:39Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.513082 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e38cd387-2454-4092-af30-fe106050ff4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a08ec3ab06757102ecbbc95e9cc79eb71d865da5a65cd8f29e12f352ed3151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://56108f8d576068284d101e0f67c69e0fa70cd319c0c8adf6dc6ef5b4ca740ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e067ad99063e0b51bbd523a4aa50db91d38297ebe45eca366daafe236af8dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d86543be9289c2457036f075031434ce5bdac8ae39588e9962a410b55bc55e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager
-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:39Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.514658 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.514709 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.514725 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.514746 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.514765 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:39Z","lastTransitionTime":"2025-10-03T14:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.530439 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96c267a4-853a-4ae7-9512-6b2eeaad54b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504dea3d2b1b87c18ec55adc418d28d11daed109a5d0beafdc51272eecd913a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7478f5e5f0fe19f471a1c93cb03a81ef258efa7824938f81b89f795b7ae133\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf214fe3a62bd6b0b087671e4a98172766327c91f6f97fb429f1d494a907c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6
86819b674279f59619905ac4ba451ca79c97fc652a773307eca3fad2352ddda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b9878967028966b092a2f32cc162eb31383472c728c84b7f49c610892f5baa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:43:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 14:43:21.918193 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 14:43:21.918497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:43:21.920302 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-693981106/tls.crt::/tmp/serving-cert-693981106/tls.key\\\\\\\"\\\\nI1003 14:43:22.187105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 14:43:22.192967 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 14:43:22.193030 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 14:43:22.193058 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 14:43:22.193068 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 14:43:22.201567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 14:43:22.201595 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201600 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 14:43:22.201604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 14:43:22.201608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 14:43:22.201611 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 14:43:22.201615 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 14:43:22.201789 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 14:43:22.202859 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ec292319ce11751e337f3742a07e7e65c7e646816a28e3322d1d47f1b51f5d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db4d5142274312e946799cb3282e24d6a61a8fff948c60c855196265e14164c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:39Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.552628 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd06fa447f9a388f7cabafaff05c8d6dc9aedc161d2097f81855318c8b7712e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b330d308f14c0b3b3630e89fe3e43728e17dbc3fe78cf9cae365a3ab0474c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b575a360d2ee43bc2629635a4ffda77a1b0e1a171bbd99b76b725784be537e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae8d985dec947c332efa41abbc5c30a978f06942cc62a86ab45e90fc2e5d763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aaa74c70ca8d93d672246f3451771bffa1e304c88a453cbbe3a305a304954fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0d624981ca616d27057de09511f8a17b643686871e2e77c15bd1c65fba9c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5d7482646eb7f611642d40cc2246f4325e1fdc7d5b08eab0ac5777911a9b209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5d7482646eb7f611642d40cc2246f4325e1fdc7d5b08eab0ac5777911a9b209\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:44:20Z\\\",\\\"message\\\":\\\"n.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator-webhook]} name:Service_openshift-machine-api/machine-api-operator-webhook_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.5.254:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e4e4203e-87c7-4024-930a-5d6bdfe2bdde}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 14:44:20.331150 6849 ovnkube.go:599] Stopped ovnkube\\\\nI1003 14:44:20.331172 6849 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 14:44:20.331227 6849 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped a\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:44:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jzv75_openshift-ovn-kubernetes(01bef0c3-23b3-4f49-8c33-3f2ec7503b12)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac1b1bb4b30aa1d11558ff126ad87e4dd7b9aaf255cd4a1860aacdee30a0bb15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:43:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://583beceb4b20a27948
874d13b00bd03fc24c94abfc0bcccec417845822583a5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:43:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp65q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jzv75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:39Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.569987 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ghf5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d3c89f-9fbd-4d50-840a-c5c78528c903\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnpd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:43:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ghf5t\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:44:39Z is after 2025-08-24T17:21:41Z" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.620190 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.620275 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.620289 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.620311 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.620327 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:39Z","lastTransitionTime":"2025-10-03T14:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.723082 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.723200 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.723219 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.723243 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.723262 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:39Z","lastTransitionTime":"2025-10-03T14:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.825654 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.825709 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.825725 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.825746 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.825761 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:39Z","lastTransitionTime":"2025-10-03T14:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.928188 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.928248 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.928270 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.928298 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:39 crc kubenswrapper[4774]: I1003 14:44:39.928320 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:39Z","lastTransitionTime":"2025-10-03T14:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.030819 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.030856 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.030869 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.030886 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.030898 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:40Z","lastTransitionTime":"2025-10-03T14:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.133633 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.133711 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.133725 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.133745 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.133758 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:40Z","lastTransitionTime":"2025-10-03T14:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.236188 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.236238 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.236251 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.236270 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.236287 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:40Z","lastTransitionTime":"2025-10-03T14:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.299168 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.299168 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.299188 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:44:40 crc kubenswrapper[4774]: E1003 14:44:40.299429 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:44:40 crc kubenswrapper[4774]: E1003 14:44:40.299573 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:44:40 crc kubenswrapper[4774]: E1003 14:44:40.299604 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.339159 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.339230 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.339255 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.339284 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.339306 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:40Z","lastTransitionTime":"2025-10-03T14:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.441082 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.441130 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.441139 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.441151 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.441161 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:40Z","lastTransitionTime":"2025-10-03T14:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.543473 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.543518 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.543529 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.543545 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.543559 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:40Z","lastTransitionTime":"2025-10-03T14:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.646199 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.646238 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.646249 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.646264 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.646276 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:40Z","lastTransitionTime":"2025-10-03T14:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.748330 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.748431 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.748456 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.748486 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.748511 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:40Z","lastTransitionTime":"2025-10-03T14:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.856276 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.856342 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.856361 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.856414 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.856435 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:40Z","lastTransitionTime":"2025-10-03T14:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.958585 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.958629 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.958641 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.958658 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:40 crc kubenswrapper[4774]: I1003 14:44:40.958671 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:40Z","lastTransitionTime":"2025-10-03T14:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.064774 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.064843 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.064855 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.064874 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.064893 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:41Z","lastTransitionTime":"2025-10-03T14:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.166680 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.166743 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.166761 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.166784 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.166802 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:41Z","lastTransitionTime":"2025-10-03T14:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.268839 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.268878 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.268888 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.268902 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.268913 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:41Z","lastTransitionTime":"2025-10-03T14:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.299585 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:44:41 crc kubenswrapper[4774]: E1003 14:44:41.299733 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.375157 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.375314 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.376086 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.376143 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.376163 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:41Z","lastTransitionTime":"2025-10-03T14:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.479534 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.479573 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.479582 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.479596 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.479606 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:41Z","lastTransitionTime":"2025-10-03T14:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.581565 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.581610 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.581622 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.581638 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.581840 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:41Z","lastTransitionTime":"2025-10-03T14:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.684823 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.684920 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.684944 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.684973 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.684995 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:41Z","lastTransitionTime":"2025-10-03T14:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.787996 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.788072 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.788110 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.788141 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.788163 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:41Z","lastTransitionTime":"2025-10-03T14:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.890291 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.890330 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.890406 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.890443 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.890464 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:41Z","lastTransitionTime":"2025-10-03T14:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.993442 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.993486 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.993498 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.993519 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:41 crc kubenswrapper[4774]: I1003 14:44:41.993531 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:41Z","lastTransitionTime":"2025-10-03T14:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.096028 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.096071 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.096081 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.096096 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.096108 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:42Z","lastTransitionTime":"2025-10-03T14:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.199242 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.199306 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.199324 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.199349 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.199365 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:42Z","lastTransitionTime":"2025-10-03T14:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.298354 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.298458 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.298491 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:44:42 crc kubenswrapper[4774]: E1003 14:44:42.298678 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:44:42 crc kubenswrapper[4774]: E1003 14:44:42.298944 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:44:42 crc kubenswrapper[4774]: E1003 14:44:42.298991 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.301648 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.301691 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.301704 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.301724 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.301739 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:42Z","lastTransitionTime":"2025-10-03T14:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.404610 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.404654 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.404666 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.404685 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.404696 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:42Z","lastTransitionTime":"2025-10-03T14:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.506887 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.506946 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.506964 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.506988 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.507005 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:42Z","lastTransitionTime":"2025-10-03T14:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.610219 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.610261 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.610274 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.610291 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.610303 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:42Z","lastTransitionTime":"2025-10-03T14:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.712554 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.712614 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.712642 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.712658 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.712669 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:42Z","lastTransitionTime":"2025-10-03T14:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.815438 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.815565 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.815583 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.815605 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.815621 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:42Z","lastTransitionTime":"2025-10-03T14:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.918411 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.918471 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.918491 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.918514 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:42 crc kubenswrapper[4774]: I1003 14:44:42.918530 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:42Z","lastTransitionTime":"2025-10-03T14:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.020895 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.020934 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.020945 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.020962 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.020974 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:43Z","lastTransitionTime":"2025-10-03T14:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.124496 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.124584 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.124599 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.124618 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.124632 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:43Z","lastTransitionTime":"2025-10-03T14:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.227675 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.227724 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.227739 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.227772 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.227786 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:43Z","lastTransitionTime":"2025-10-03T14:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.298756 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:44:43 crc kubenswrapper[4774]: E1003 14:44:43.298925 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.329700 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.329755 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.329804 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.329837 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.329856 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:43Z","lastTransitionTime":"2025-10-03T14:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.432487 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.432540 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.432557 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.432584 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.432601 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:43Z","lastTransitionTime":"2025-10-03T14:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.535801 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.535898 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.535925 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.535950 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.535968 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:43Z","lastTransitionTime":"2025-10-03T14:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.639463 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.639514 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.639525 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.639543 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.639554 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:43Z","lastTransitionTime":"2025-10-03T14:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.742575 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.742637 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.742655 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.742678 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.742693 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:43Z","lastTransitionTime":"2025-10-03T14:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.854184 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.854218 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.854226 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.854238 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.854247 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:43Z","lastTransitionTime":"2025-10-03T14:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.957287 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.957332 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.957347 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.957365 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:43 crc kubenswrapper[4774]: I1003 14:44:43.957407 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:43Z","lastTransitionTime":"2025-10-03T14:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.060743 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.060815 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.060833 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.060858 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.060883 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:44Z","lastTransitionTime":"2025-10-03T14:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.096680 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88d3c89f-9fbd-4d50-840a-c5c78528c903-metrics-certs\") pod \"network-metrics-daemon-ghf5t\" (UID: \"88d3c89f-9fbd-4d50-840a-c5c78528c903\") " pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:44:44 crc kubenswrapper[4774]: E1003 14:44:44.096974 4774 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 14:44:44 crc kubenswrapper[4774]: E1003 14:44:44.097100 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88d3c89f-9fbd-4d50-840a-c5c78528c903-metrics-certs podName:88d3c89f-9fbd-4d50-840a-c5c78528c903 nodeName:}" failed. No retries permitted until 2025-10-03 14:45:48.097071544 +0000 UTC m=+170.686275066 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/88d3c89f-9fbd-4d50-840a-c5c78528c903-metrics-certs") pod "network-metrics-daemon-ghf5t" (UID: "88d3c89f-9fbd-4d50-840a-c5c78528c903") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.164191 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.164243 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.164256 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.164276 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.164289 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:44Z","lastTransitionTime":"2025-10-03T14:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.267074 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.267120 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.267132 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.267149 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.267161 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:44Z","lastTransitionTime":"2025-10-03T14:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.298637 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.298704 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.298693 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:44:44 crc kubenswrapper[4774]: E1003 14:44:44.298987 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:44:44 crc kubenswrapper[4774]: E1003 14:44:44.299066 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:44:44 crc kubenswrapper[4774]: E1003 14:44:44.299194 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.369920 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.369983 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.370004 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.370031 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.370050 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:44Z","lastTransitionTime":"2025-10-03T14:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.472945 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.472977 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.472988 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.473005 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.473015 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:44Z","lastTransitionTime":"2025-10-03T14:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.504901 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.504958 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.504977 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.504999 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.505018 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:44:44Z","lastTransitionTime":"2025-10-03T14:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.581602 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zgwj"] Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.582217 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zgwj" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.584336 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.584794 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.585000 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.586565 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.604671 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jlwtx" podStartSLOduration=78.604656486 podStartE2EDuration="1m18.604656486s" podCreationTimestamp="2025-10-03 14:43:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:44:44.603857085 +0000 UTC m=+107.193060537" watchObservedRunningTime="2025-10-03 14:44:44.604656486 +0000 UTC m=+107.193859938" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.626921 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=78.626898013 podStartE2EDuration="1m18.626898013s" podCreationTimestamp="2025-10-03 14:43:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:44:44.626845521 +0000 UTC m=+107.216048984" watchObservedRunningTime="2025-10-03 
14:44:44.626898013 +0000 UTC m=+107.216101465" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.640796 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=82.640779295 podStartE2EDuration="1m22.640779295s" podCreationTimestamp="2025-10-03 14:43:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:44:44.640647462 +0000 UTC m=+107.229850954" watchObservedRunningTime="2025-10-03 14:44:44.640779295 +0000 UTC m=+107.229982747" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.702862 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd8aab45-75a9-4321-84a4-f0c5555b7173-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-9zgwj\" (UID: \"cd8aab45-75a9-4321-84a4-f0c5555b7173\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zgwj" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.702912 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/cd8aab45-75a9-4321-84a4-f0c5555b7173-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-9zgwj\" (UID: \"cd8aab45-75a9-4321-84a4-f0c5555b7173\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zgwj" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.702932 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd8aab45-75a9-4321-84a4-f0c5555b7173-service-ca\") pod \"cluster-version-operator-5c965bbfc6-9zgwj\" (UID: \"cd8aab45-75a9-4321-84a4-f0c5555b7173\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zgwj" Oct 03 14:44:44 
crc kubenswrapper[4774]: I1003 14:44:44.702985 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/cd8aab45-75a9-4321-84a4-f0c5555b7173-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-9zgwj\" (UID: \"cd8aab45-75a9-4321-84a4-f0c5555b7173\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zgwj" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.702999 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd8aab45-75a9-4321-84a4-f0c5555b7173-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-9zgwj\" (UID: \"cd8aab45-75a9-4321-84a4-f0c5555b7173\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zgwj" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.787253 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podStartSLOduration=79.787233506 podStartE2EDuration="1m19.787233506s" podCreationTimestamp="2025-10-03 14:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:44:44.771084892 +0000 UTC m=+107.360288344" watchObservedRunningTime="2025-10-03 14:44:44.787233506 +0000 UTC m=+107.376436958" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.800599 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-rsftk" podStartSLOduration=79.800579774 podStartE2EDuration="1m19.800579774s" podCreationTimestamp="2025-10-03 14:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:44:44.799652179 +0000 UTC m=+107.388855651" watchObservedRunningTime="2025-10-03 
14:44:44.800579774 +0000 UTC m=+107.389783226" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.800711 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-5bs7x" podStartSLOduration=79.800705457 podStartE2EDuration="1m19.800705457s" podCreationTimestamp="2025-10-03 14:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:44:44.787792081 +0000 UTC m=+107.376995533" watchObservedRunningTime="2025-10-03 14:44:44.800705457 +0000 UTC m=+107.389908909" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.803694 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/cd8aab45-75a9-4321-84a4-f0c5555b7173-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-9zgwj\" (UID: \"cd8aab45-75a9-4321-84a4-f0c5555b7173\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zgwj" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.803730 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd8aab45-75a9-4321-84a4-f0c5555b7173-service-ca\") pod \"cluster-version-operator-5c965bbfc6-9zgwj\" (UID: \"cd8aab45-75a9-4321-84a4-f0c5555b7173\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zgwj" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.803785 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/cd8aab45-75a9-4321-84a4-f0c5555b7173-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-9zgwj\" (UID: \"cd8aab45-75a9-4321-84a4-f0c5555b7173\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zgwj" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.803802 4774 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd8aab45-75a9-4321-84a4-f0c5555b7173-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-9zgwj\" (UID: \"cd8aab45-75a9-4321-84a4-f0c5555b7173\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zgwj" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.803826 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd8aab45-75a9-4321-84a4-f0c5555b7173-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-9zgwj\" (UID: \"cd8aab45-75a9-4321-84a4-f0c5555b7173\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zgwj" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.804069 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/cd8aab45-75a9-4321-84a4-f0c5555b7173-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-9zgwj\" (UID: \"cd8aab45-75a9-4321-84a4-f0c5555b7173\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zgwj" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.804842 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd8aab45-75a9-4321-84a4-f0c5555b7173-service-ca\") pod \"cluster-version-operator-5c965bbfc6-9zgwj\" (UID: \"cd8aab45-75a9-4321-84a4-f0c5555b7173\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zgwj" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.804884 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/cd8aab45-75a9-4321-84a4-f0c5555b7173-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-9zgwj\" (UID: \"cd8aab45-75a9-4321-84a4-f0c5555b7173\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zgwj" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.813275 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd8aab45-75a9-4321-84a4-f0c5555b7173-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-9zgwj\" (UID: \"cd8aab45-75a9-4321-84a4-f0c5555b7173\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zgwj" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.824115 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=48.824096475 podStartE2EDuration="48.824096475s" podCreationTimestamp="2025-10-03 14:43:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:44:44.82353754 +0000 UTC m=+107.412740992" watchObservedRunningTime="2025-10-03 14:44:44.824096475 +0000 UTC m=+107.413299917" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.829358 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd8aab45-75a9-4321-84a4-f0c5555b7173-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-9zgwj\" (UID: \"cd8aab45-75a9-4321-84a4-f0c5555b7173\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zgwj" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.836310 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=15.836296582 podStartE2EDuration="15.836296582s" podCreationTimestamp="2025-10-03 14:44:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:44:44.836153769 +0000 UTC 
m=+107.425357221" watchObservedRunningTime="2025-10-03 14:44:44.836296582 +0000 UTC m=+107.425500024" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.858138 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=80.858123998 podStartE2EDuration="1m20.858123998s" podCreationTimestamp="2025-10-03 14:43:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:44:44.857340407 +0000 UTC m=+107.446543879" watchObservedRunningTime="2025-10-03 14:44:44.858123998 +0000 UTC m=+107.447327450" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.895823 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-wspzq" podStartSLOduration=79.89580893 podStartE2EDuration="1m19.89580893s" podCreationTimestamp="2025-10-03 14:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:44:44.882759009 +0000 UTC m=+107.471962481" watchObservedRunningTime="2025-10-03 14:44:44.89580893 +0000 UTC m=+107.485012382" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.895903 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-jk5hb" podStartSLOduration=79.895901052 podStartE2EDuration="1m19.895901052s" podCreationTimestamp="2025-10-03 14:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:44:44.894797562 +0000 UTC m=+107.484001024" watchObservedRunningTime="2025-10-03 14:44:44.895901052 +0000 UTC m=+107.485104494" Oct 03 14:44:44 crc kubenswrapper[4774]: I1003 14:44:44.900969 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zgwj" Oct 03 14:44:45 crc kubenswrapper[4774]: I1003 14:44:45.298882 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:44:45 crc kubenswrapper[4774]: E1003 14:44:45.299931 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:44:45 crc kubenswrapper[4774]: I1003 14:44:45.862763 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zgwj" event={"ID":"cd8aab45-75a9-4321-84a4-f0c5555b7173","Type":"ContainerStarted","Data":"d178e5c804ad7c012dab9585fca74378f639c9a4e4d6ddc877d217f1ef610529"} Oct 03 14:44:45 crc kubenswrapper[4774]: I1003 14:44:45.862839 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zgwj" event={"ID":"cd8aab45-75a9-4321-84a4-f0c5555b7173","Type":"ContainerStarted","Data":"1b32b29476395abe473eefb95b94bc0308c9ad1f28cc0a35959d928b2a420cc4"} Oct 03 14:44:46 crc kubenswrapper[4774]: I1003 14:44:46.298653 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:44:46 crc kubenswrapper[4774]: I1003 14:44:46.298677 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:44:46 crc kubenswrapper[4774]: E1003 14:44:46.299228 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:44:46 crc kubenswrapper[4774]: I1003 14:44:46.298721 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:44:46 crc kubenswrapper[4774]: E1003 14:44:46.299294 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:44:46 crc kubenswrapper[4774]: E1003 14:44:46.299774 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:44:47 crc kubenswrapper[4774]: I1003 14:44:47.299106 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:44:47 crc kubenswrapper[4774]: E1003 14:44:47.299736 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:44:47 crc kubenswrapper[4774]: I1003 14:44:47.299869 4774 scope.go:117] "RemoveContainer" containerID="d5d7482646eb7f611642d40cc2246f4325e1fdc7d5b08eab0ac5777911a9b209" Oct 03 14:44:47 crc kubenswrapper[4774]: E1003 14:44:47.299982 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jzv75_openshift-ovn-kubernetes(01bef0c3-23b3-4f49-8c33-3f2ec7503b12)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" Oct 03 14:44:48 crc kubenswrapper[4774]: I1003 14:44:48.298770 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:44:48 crc kubenswrapper[4774]: I1003 14:44:48.298798 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:44:48 crc kubenswrapper[4774]: I1003 14:44:48.298817 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:44:48 crc kubenswrapper[4774]: E1003 14:44:48.298972 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:44:48 crc kubenswrapper[4774]: E1003 14:44:48.298972 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:44:48 crc kubenswrapper[4774]: E1003 14:44:48.299051 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:44:49 crc kubenswrapper[4774]: I1003 14:44:49.298505 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:44:49 crc kubenswrapper[4774]: E1003 14:44:49.299580 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:44:50 crc kubenswrapper[4774]: I1003 14:44:50.298927 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:44:50 crc kubenswrapper[4774]: I1003 14:44:50.298971 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:44:50 crc kubenswrapper[4774]: I1003 14:44:50.299040 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:44:50 crc kubenswrapper[4774]: E1003 14:44:50.299178 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:44:50 crc kubenswrapper[4774]: E1003 14:44:50.299263 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:44:50 crc kubenswrapper[4774]: E1003 14:44:50.299438 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:44:51 crc kubenswrapper[4774]: I1003 14:44:51.298418 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:44:51 crc kubenswrapper[4774]: E1003 14:44:51.299335 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:44:52 crc kubenswrapper[4774]: I1003 14:44:52.298501 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:44:52 crc kubenswrapper[4774]: I1003 14:44:52.298578 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:44:52 crc kubenswrapper[4774]: E1003 14:44:52.298628 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:44:52 crc kubenswrapper[4774]: I1003 14:44:52.298644 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:44:52 crc kubenswrapper[4774]: E1003 14:44:52.298756 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:44:52 crc kubenswrapper[4774]: E1003 14:44:52.298867 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:44:53 crc kubenswrapper[4774]: I1003 14:44:53.299057 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:44:53 crc kubenswrapper[4774]: E1003 14:44:53.299204 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:44:54 crc kubenswrapper[4774]: I1003 14:44:54.298648 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:44:54 crc kubenswrapper[4774]: I1003 14:44:54.298713 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:44:54 crc kubenswrapper[4774]: E1003 14:44:54.298779 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:44:54 crc kubenswrapper[4774]: I1003 14:44:54.298660 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:44:54 crc kubenswrapper[4774]: E1003 14:44:54.298952 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:44:54 crc kubenswrapper[4774]: E1003 14:44:54.298980 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:44:55 crc kubenswrapper[4774]: I1003 14:44:55.298737 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:44:55 crc kubenswrapper[4774]: E1003 14:44:55.298995 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:44:56 crc kubenswrapper[4774]: I1003 14:44:56.298697 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:44:56 crc kubenswrapper[4774]: I1003 14:44:56.298717 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:44:56 crc kubenswrapper[4774]: E1003 14:44:56.298783 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:44:56 crc kubenswrapper[4774]: I1003 14:44:56.298722 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:44:56 crc kubenswrapper[4774]: E1003 14:44:56.298852 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:44:56 crc kubenswrapper[4774]: E1003 14:44:56.298912 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:44:57 crc kubenswrapper[4774]: I1003 14:44:57.298981 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:44:57 crc kubenswrapper[4774]: E1003 14:44:57.299121 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:44:58 crc kubenswrapper[4774]: I1003 14:44:58.299350 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:44:58 crc kubenswrapper[4774]: I1003 14:44:58.299439 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:44:58 crc kubenswrapper[4774]: I1003 14:44:58.299366 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:44:58 crc kubenswrapper[4774]: E1003 14:44:58.299647 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:44:58 crc kubenswrapper[4774]: E1003 14:44:58.299782 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:44:58 crc kubenswrapper[4774]: E1003 14:44:58.299931 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:44:59 crc kubenswrapper[4774]: E1003 14:44:59.272783 4774 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 03 14:44:59 crc kubenswrapper[4774]: I1003 14:44:59.298818 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:44:59 crc kubenswrapper[4774]: E1003 14:44:59.300043 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:44:59 crc kubenswrapper[4774]: I1003 14:44:59.300803 4774 scope.go:117] "RemoveContainer" containerID="d5d7482646eb7f611642d40cc2246f4325e1fdc7d5b08eab0ac5777911a9b209" Oct 03 14:44:59 crc kubenswrapper[4774]: E1003 14:44:59.300998 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jzv75_openshift-ovn-kubernetes(01bef0c3-23b3-4f49-8c33-3f2ec7503b12)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" Oct 03 14:44:59 crc kubenswrapper[4774]: E1003 14:44:59.640762 4774 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 03 14:45:00 crc kubenswrapper[4774]: I1003 14:45:00.299257 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:45:00 crc kubenswrapper[4774]: E1003 14:45:00.299420 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:45:00 crc kubenswrapper[4774]: I1003 14:45:00.299257 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:45:00 crc kubenswrapper[4774]: E1003 14:45:00.299505 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:45:00 crc kubenswrapper[4774]: I1003 14:45:00.299257 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:45:00 crc kubenswrapper[4774]: E1003 14:45:00.299576 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:45:01 crc kubenswrapper[4774]: I1003 14:45:01.298972 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:45:01 crc kubenswrapper[4774]: E1003 14:45:01.299138 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:45:01 crc kubenswrapper[4774]: I1003 14:45:01.914486 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jk5hb_4f2cc8dc-61c3-4a0b-8da3-b899094eaa53/kube-multus/1.log" Oct 03 14:45:01 crc kubenswrapper[4774]: I1003 14:45:01.915030 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jk5hb_4f2cc8dc-61c3-4a0b-8da3-b899094eaa53/kube-multus/0.log" Oct 03 14:45:01 crc kubenswrapper[4774]: I1003 14:45:01.915085 4774 generic.go:334] "Generic (PLEG): container finished" podID="4f2cc8dc-61c3-4a0b-8da3-b899094eaa53" containerID="a17e01ed9b7f955272f2c5bb14a9624d7faeb0a4727698c093a2acc4e2da71af" exitCode=1 Oct 03 14:45:01 crc kubenswrapper[4774]: I1003 14:45:01.915115 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jk5hb" event={"ID":"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53","Type":"ContainerDied","Data":"a17e01ed9b7f955272f2c5bb14a9624d7faeb0a4727698c093a2acc4e2da71af"} Oct 03 14:45:01 crc kubenswrapper[4774]: I1003 14:45:01.915153 4774 scope.go:117] "RemoveContainer" containerID="67cf96527fb24270a82233834761f41ff7669ca61fd5d8d10acbde2de9b1163b" Oct 03 14:45:01 crc kubenswrapper[4774]: I1003 14:45:01.915983 4774 scope.go:117] "RemoveContainer" containerID="a17e01ed9b7f955272f2c5bb14a9624d7faeb0a4727698c093a2acc4e2da71af" Oct 03 14:45:01 crc kubenswrapper[4774]: E1003 14:45:01.916506 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-jk5hb_openshift-multus(4f2cc8dc-61c3-4a0b-8da3-b899094eaa53)\"" pod="openshift-multus/multus-jk5hb" podUID="4f2cc8dc-61c3-4a0b-8da3-b899094eaa53" Oct 03 14:45:01 crc kubenswrapper[4774]: I1003 14:45:01.936270 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zgwj" podStartSLOduration=96.936242403 podStartE2EDuration="1m36.936242403s" podCreationTimestamp="2025-10-03 14:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:44:45.882502489 +0000 UTC m=+108.471705981" watchObservedRunningTime="2025-10-03 14:45:01.936242403 +0000 UTC m=+124.525445895" Oct 03 14:45:02 crc kubenswrapper[4774]: I1003 14:45:02.298733 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:45:02 crc kubenswrapper[4774]: I1003 14:45:02.298824 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:45:02 crc kubenswrapper[4774]: I1003 14:45:02.298770 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:45:02 crc kubenswrapper[4774]: E1003 14:45:02.299023 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:45:02 crc kubenswrapper[4774]: E1003 14:45:02.299081 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:45:02 crc kubenswrapper[4774]: E1003 14:45:02.299232 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:45:02 crc kubenswrapper[4774]: I1003 14:45:02.923038 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jk5hb_4f2cc8dc-61c3-4a0b-8da3-b899094eaa53/kube-multus/1.log" Oct 03 14:45:03 crc kubenswrapper[4774]: I1003 14:45:03.298640 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:45:03 crc kubenswrapper[4774]: E1003 14:45:03.298940 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:45:04 crc kubenswrapper[4774]: I1003 14:45:04.298902 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:45:04 crc kubenswrapper[4774]: I1003 14:45:04.298954 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:45:04 crc kubenswrapper[4774]: I1003 14:45:04.298918 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:45:04 crc kubenswrapper[4774]: E1003 14:45:04.299041 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:45:04 crc kubenswrapper[4774]: E1003 14:45:04.299105 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:45:04 crc kubenswrapper[4774]: E1003 14:45:04.299258 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:45:04 crc kubenswrapper[4774]: E1003 14:45:04.641853 4774 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 03 14:45:05 crc kubenswrapper[4774]: I1003 14:45:05.299330 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:45:05 crc kubenswrapper[4774]: E1003 14:45:05.299542 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:45:06 crc kubenswrapper[4774]: I1003 14:45:06.299051 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:45:06 crc kubenswrapper[4774]: E1003 14:45:06.299640 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:45:06 crc kubenswrapper[4774]: I1003 14:45:06.299770 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:45:06 crc kubenswrapper[4774]: I1003 14:45:06.299707 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:45:06 crc kubenswrapper[4774]: E1003 14:45:06.300347 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:45:06 crc kubenswrapper[4774]: E1003 14:45:06.300580 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:45:07 crc kubenswrapper[4774]: I1003 14:45:07.298820 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:45:07 crc kubenswrapper[4774]: E1003 14:45:07.298944 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:45:08 crc kubenswrapper[4774]: I1003 14:45:08.298496 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:45:08 crc kubenswrapper[4774]: E1003 14:45:08.298630 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:45:08 crc kubenswrapper[4774]: I1003 14:45:08.298809 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:45:08 crc kubenswrapper[4774]: E1003 14:45:08.298868 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:45:08 crc kubenswrapper[4774]: I1003 14:45:08.298979 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:45:08 crc kubenswrapper[4774]: E1003 14:45:08.299032 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:45:09 crc kubenswrapper[4774]: I1003 14:45:09.300430 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:45:09 crc kubenswrapper[4774]: E1003 14:45:09.300644 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:45:09 crc kubenswrapper[4774]: E1003 14:45:09.642366 4774 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 03 14:45:10 crc kubenswrapper[4774]: I1003 14:45:10.298776 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:45:10 crc kubenswrapper[4774]: I1003 14:45:10.298832 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:45:10 crc kubenswrapper[4774]: I1003 14:45:10.298871 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:45:10 crc kubenswrapper[4774]: E1003 14:45:10.298986 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:45:10 crc kubenswrapper[4774]: E1003 14:45:10.299163 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:45:10 crc kubenswrapper[4774]: E1003 14:45:10.299241 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:45:11 crc kubenswrapper[4774]: I1003 14:45:11.299129 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:45:11 crc kubenswrapper[4774]: E1003 14:45:11.299279 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:45:12 crc kubenswrapper[4774]: I1003 14:45:12.298774 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:45:12 crc kubenswrapper[4774]: I1003 14:45:12.298859 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:45:12 crc kubenswrapper[4774]: E1003 14:45:12.298948 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:45:12 crc kubenswrapper[4774]: I1003 14:45:12.298966 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:45:12 crc kubenswrapper[4774]: E1003 14:45:12.299323 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:45:12 crc kubenswrapper[4774]: E1003 14:45:12.299361 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:45:13 crc kubenswrapper[4774]: I1003 14:45:13.299583 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:45:13 crc kubenswrapper[4774]: I1003 14:45:13.299837 4774 scope.go:117] "RemoveContainer" containerID="d5d7482646eb7f611642d40cc2246f4325e1fdc7d5b08eab0ac5777911a9b209" Oct 03 14:45:13 crc kubenswrapper[4774]: E1003 14:45:13.299992 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:45:13 crc kubenswrapper[4774]: I1003 14:45:13.960607 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzv75_01bef0c3-23b3-4f49-8c33-3f2ec7503b12/ovnkube-controller/3.log" Oct 03 14:45:13 crc kubenswrapper[4774]: I1003 14:45:13.963073 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" event={"ID":"01bef0c3-23b3-4f49-8c33-3f2ec7503b12","Type":"ContainerStarted","Data":"3d421c202f552f64422b854f56305a823b6051acede4800cdb123010bdd2af47"} Oct 03 14:45:13 crc kubenswrapper[4774]: I1003 14:45:13.963473 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:45:13 crc kubenswrapper[4774]: I1003 14:45:13.988153 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" podStartSLOduration=108.988135652 podStartE2EDuration="1m48.988135652s" podCreationTimestamp="2025-10-03 14:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:13.987336617 +0000 UTC m=+136.576540089" watchObservedRunningTime="2025-10-03 14:45:13.988135652 +0000 UTC m=+136.577339104" Oct 03 14:45:14 crc kubenswrapper[4774]: I1003 14:45:14.159122 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ghf5t"] Oct 03 14:45:14 crc kubenswrapper[4774]: I1003 14:45:14.159216 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:45:14 crc kubenswrapper[4774]: E1003 14:45:14.159299 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:45:14 crc kubenswrapper[4774]: I1003 14:45:14.298742 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:45:14 crc kubenswrapper[4774]: I1003 14:45:14.298847 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:45:14 crc kubenswrapper[4774]: I1003 14:45:14.298857 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:45:14 crc kubenswrapper[4774]: E1003 14:45:14.299484 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:45:14 crc kubenswrapper[4774]: E1003 14:45:14.299700 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:45:14 crc kubenswrapper[4774]: E1003 14:45:14.299816 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:45:14 crc kubenswrapper[4774]: E1003 14:45:14.644533 4774 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 03 14:45:15 crc kubenswrapper[4774]: I1003 14:45:15.299281 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:45:15 crc kubenswrapper[4774]: E1003 14:45:15.299437 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:45:16 crc kubenswrapper[4774]: I1003 14:45:16.299100 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:45:16 crc kubenswrapper[4774]: I1003 14:45:16.299195 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:45:16 crc kubenswrapper[4774]: E1003 14:45:16.299292 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:45:16 crc kubenswrapper[4774]: I1003 14:45:16.299220 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:45:16 crc kubenswrapper[4774]: E1003 14:45:16.299491 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:45:16 crc kubenswrapper[4774]: E1003 14:45:16.299535 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:45:17 crc kubenswrapper[4774]: I1003 14:45:17.299158 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:45:17 crc kubenswrapper[4774]: E1003 14:45:17.299518 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:45:17 crc kubenswrapper[4774]: I1003 14:45:17.299722 4774 scope.go:117] "RemoveContainer" containerID="a17e01ed9b7f955272f2c5bb14a9624d7faeb0a4727698c093a2acc4e2da71af" Oct 03 14:45:17 crc kubenswrapper[4774]: I1003 14:45:17.979656 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jk5hb_4f2cc8dc-61c3-4a0b-8da3-b899094eaa53/kube-multus/1.log" Oct 03 14:45:17 crc kubenswrapper[4774]: I1003 14:45:17.979711 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jk5hb" event={"ID":"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53","Type":"ContainerStarted","Data":"ac62cac6bdedfb0b9adab5e07f0bd50c6fbda746776d2da997f7537cd0c44d2a"} Oct 03 14:45:18 crc kubenswrapper[4774]: I1003 14:45:18.299027 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:45:18 crc kubenswrapper[4774]: I1003 14:45:18.299103 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:45:18 crc kubenswrapper[4774]: E1003 14:45:18.299164 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:45:18 crc kubenswrapper[4774]: E1003 14:45:18.299235 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:45:18 crc kubenswrapper[4774]: I1003 14:45:18.299035 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:45:18 crc kubenswrapper[4774]: E1003 14:45:18.299501 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:45:19 crc kubenswrapper[4774]: I1003 14:45:19.298940 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:45:19 crc kubenswrapper[4774]: E1003 14:45:19.301042 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ghf5t" podUID="88d3c89f-9fbd-4d50-840a-c5c78528c903" Oct 03 14:45:20 crc kubenswrapper[4774]: I1003 14:45:20.298544 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:45:20 crc kubenswrapper[4774]: I1003 14:45:20.298616 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:45:20 crc kubenswrapper[4774]: I1003 14:45:20.298616 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:45:20 crc kubenswrapper[4774]: I1003 14:45:20.301891 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 03 14:45:20 crc kubenswrapper[4774]: I1003 14:45:20.302157 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 03 14:45:20 crc kubenswrapper[4774]: I1003 14:45:20.302231 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 03 14:45:20 crc kubenswrapper[4774]: I1003 14:45:20.305844 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 03 14:45:21 crc kubenswrapper[4774]: I1003 14:45:21.298621 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:45:21 crc kubenswrapper[4774]: I1003 14:45:21.301890 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 03 14:45:21 crc kubenswrapper[4774]: I1003 14:45:21.301971 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 03 14:45:22 crc kubenswrapper[4774]: I1003 14:45:22.728290 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.433255 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.481502 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vpxwc"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.481947 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-kzdbd"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.482128 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vpxwc" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.483156 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7nrwv"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.483357 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.483885 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.484241 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7nrwv" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.484814 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.485462 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.485481 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.485587 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.486344 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrmzx"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.487109 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrmzx" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.487679 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pnvpf"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.489584 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pnvpf" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.490807 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-crfxb"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.492001 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crfxb" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.500833 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-7qr8w"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.501296 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-7qr8w" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.501895 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.502245 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.502673 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.502753 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.502954 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.503031 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 03 14:45:25 crc 
kubenswrapper[4774]: I1003 14:45:25.503093 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.503154 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.503243 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.503324 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.503364 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.503407 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.503332 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.503491 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.503521 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.503574 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.503580 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.503613 4774 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.503708 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.503748 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.503823 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.503709 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.503866 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.503917 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.503717 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.504030 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.504101 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.504184 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 
14:45:25.504240 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.504263 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.504367 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.504483 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.504492 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.504554 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.504583 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.504497 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.504717 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.504184 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.504885 4774 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-controller-manager"/"config" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.504925 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.505482 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.505662 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.506052 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.506285 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.506751 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.509755 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l9c9f"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.511041 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-5nrkx"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.522512 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-8s9fl"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.522620 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-5nrkx" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.524208 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-jh8hv"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.524405 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8s9fl" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.528290 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-hw8fn"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.545052 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.546013 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l9c9f" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.547422 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-vnvw7"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.547781 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-p27cs"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.548421 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.549032 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-hw8fn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.549406 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-jh8hv" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.549664 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-vnvw7" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.550807 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.551222 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.554438 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zh5qn"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.555116 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zh5qn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.555260 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.555562 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8vg7"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.556358 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8vg7" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.556600 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.557126 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4sjxw"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.557569 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.558010 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.558446 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.558605 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.558838 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.559256 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.559726 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.560165 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.560412 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.561195 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j7gl6"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.561292 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.561626 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.561643 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.561807 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j7gl6" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.561911 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.561927 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.562122 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.562181 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.562224 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.562295 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.562365 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.562493 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.562537 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.562562 4774 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.562676 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.562676 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.562857 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.562973 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.563066 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.563275 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.563291 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.563344 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.563471 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.563475 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.563778 4774 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"console-serving-cert" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.563796 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.563898 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.563906 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.564021 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.564161 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.565237 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-gwcm9"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.565929 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-gwcm9" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.578460 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pdhbv"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.579047 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7z59k"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.579594 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7z59k" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.579604 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-pdnsv"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.579804 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pdhbv" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.580999 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.581121 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.585894 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.586084 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.587084 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.588130 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g6pd2"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.588704 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jmqgl"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.589053 4774 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pdnsv" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.589268 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jmqgl" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.589429 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g6pd2" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.598719 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwml7"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.599856 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwml7" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.600406 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-vkxrc"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.620477 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vkxrc" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.622852 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0c1f48e-db09-49cd-82db-e687ea384d2c-config\") pod \"authentication-operator-69f744f599-5nrkx\" (UID: \"c0c1f48e-db09-49cd-82db-e687ea384d2c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5nrkx" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.622926 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b5a60df-c3e9-4553-92d2-0c8d8752d007-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-pnvpf\" (UID: \"3b5a60df-c3e9-4553-92d2-0c8d8752d007\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pnvpf" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.622969 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/936c81dd-638d-4776-86fe-54a8aa53e50e-config\") pod \"controller-manager-879f6c89f-vpxwc\" (UID: \"936c81dd-638d-4776-86fe-54a8aa53e50e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vpxwc" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.623007 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/004c9445-7b3b-4479-bf6e-e6d880e4c7bb-serving-cert\") pod \"route-controller-manager-6576b87f9c-rrmzx\" (UID: \"004c9445-7b3b-4479-bf6e-e6d880e4c7bb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrmzx" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.623037 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/352c3d22-5aaa-47c1-a712-9022736511f4-audit-policies\") pod \"apiserver-7bbb656c7d-zp5qn\" (UID: \"352c3d22-5aaa-47c1-a712-9022736511f4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.623064 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0c1f48e-db09-49cd-82db-e687ea384d2c-serving-cert\") pod \"authentication-operator-69f744f599-5nrkx\" (UID: \"c0c1f48e-db09-49cd-82db-e687ea384d2c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5nrkx" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.623097 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/98199f15-3582-4b64-a118-c97f9ddadd11-etcd-client\") pod \"apiserver-76f77b778f-kzdbd\" (UID: \"98199f15-3582-4b64-a118-c97f9ddadd11\") " pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.623123 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/352c3d22-5aaa-47c1-a712-9022736511f4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zp5qn\" (UID: \"352c3d22-5aaa-47c1-a712-9022736511f4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.623157 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/98199f15-3582-4b64-a118-c97f9ddadd11-audit\") pod \"apiserver-76f77b778f-kzdbd\" (UID: \"98199f15-3582-4b64-a118-c97f9ddadd11\") " 
pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.623188 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsk88\" (UniqueName: \"kubernetes.io/projected/40ce0e8a-b3ee-4b5a-a3ef-ee0e4a573496-kube-api-access-fsk88\") pod \"downloads-7954f5f757-jh8hv\" (UID: \"40ce0e8a-b3ee-4b5a-a3ef-ee0e4a573496\") " pod="openshift-console/downloads-7954f5f757-jh8hv" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.623223 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxs6v\" (UniqueName: \"kubernetes.io/projected/bd925ec4-97eb-4780-82a0-e7dc782542ce-kube-api-access-fxs6v\") pod \"console-operator-58897d9998-7qr8w\" (UID: \"bd925ec4-97eb-4780-82a0-e7dc782542ce\") " pod="openshift-console-operator/console-operator-58897d9998-7qr8w" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.623267 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a5b48e24-98c3-4875-9e6f-9d2de0463048-machine-approver-tls\") pod \"machine-approver-56656f9798-crfxb\" (UID: \"a5b48e24-98c3-4875-9e6f-9d2de0463048\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crfxb" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.623306 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd925ec4-97eb-4780-82a0-e7dc782542ce-trusted-ca\") pod \"console-operator-58897d9998-7qr8w\" (UID: \"bd925ec4-97eb-4780-82a0-e7dc782542ce\") " pod="openshift-console-operator/console-operator-58897d9998-7qr8w" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.623339 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/936c81dd-638d-4776-86fe-54a8aa53e50e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-vpxwc\" (UID: \"936c81dd-638d-4776-86fe-54a8aa53e50e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vpxwc" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.631831 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwmnx\" (UniqueName: \"kubernetes.io/projected/004c9445-7b3b-4479-bf6e-e6d880e4c7bb-kube-api-access-zwmnx\") pod \"route-controller-manager-6576b87f9c-rrmzx\" (UID: \"004c9445-7b3b-4479-bf6e-e6d880e4c7bb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrmzx" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.632004 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-972t2\" (UniqueName: \"kubernetes.io/projected/0d562068-166d-42b5-9f5f-3e33460f5410-kube-api-access-972t2\") pod \"openshift-config-operator-7777fb866f-8s9fl\" (UID: \"0d562068-166d-42b5-9f5f-3e33460f5410\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8s9fl" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.632102 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/98199f15-3582-4b64-a118-c97f9ddadd11-encryption-config\") pod \"apiserver-76f77b778f-kzdbd\" (UID: \"98199f15-3582-4b64-a118-c97f9ddadd11\") " pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.632133 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b5e417ed-5b5e-405b-8b95-ed27ddaef9ee-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7nrwv\" (UID: 
\"b5e417ed-5b5e-405b-8b95-ed27ddaef9ee\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7nrwv" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.632159 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/352c3d22-5aaa-47c1-a712-9022736511f4-serving-cert\") pod \"apiserver-7bbb656c7d-zp5qn\" (UID: \"352c3d22-5aaa-47c1-a712-9022736511f4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.632186 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a5b48e24-98c3-4875-9e6f-9d2de0463048-auth-proxy-config\") pod \"machine-approver-56656f9798-crfxb\" (UID: \"a5b48e24-98c3-4875-9e6f-9d2de0463048\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crfxb" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.632212 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5b48e24-98c3-4875-9e6f-9d2de0463048-config\") pod \"machine-approver-56656f9798-crfxb\" (UID: \"a5b48e24-98c3-4875-9e6f-9d2de0463048\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crfxb" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.632314 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/98199f15-3582-4b64-a118-c97f9ddadd11-node-pullsecrets\") pod \"apiserver-76f77b778f-kzdbd\" (UID: \"98199f15-3582-4b64-a118-c97f9ddadd11\") " pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.632348 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/bd925ec4-97eb-4780-82a0-e7dc782542ce-config\") pod \"console-operator-58897d9998-7qr8w\" (UID: \"bd925ec4-97eb-4780-82a0-e7dc782542ce\") " pod="openshift-console-operator/console-operator-58897d9998-7qr8w" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.632404 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcfcz\" (UniqueName: \"kubernetes.io/projected/936c81dd-638d-4776-86fe-54a8aa53e50e-kube-api-access-vcfcz\") pod \"controller-manager-879f6c89f-vpxwc\" (UID: \"936c81dd-638d-4776-86fe-54a8aa53e50e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vpxwc" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.632430 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b5a60df-c3e9-4553-92d2-0c8d8752d007-config\") pod \"openshift-apiserver-operator-796bbdcf4f-pnvpf\" (UID: \"3b5a60df-c3e9-4553-92d2-0c8d8752d007\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pnvpf" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.632454 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/936c81dd-638d-4776-86fe-54a8aa53e50e-serving-cert\") pod \"controller-manager-879f6c89f-vpxwc\" (UID: \"936c81dd-638d-4776-86fe-54a8aa53e50e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vpxwc" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.632481 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcmrb\" (UniqueName: \"kubernetes.io/projected/b5e417ed-5b5e-405b-8b95-ed27ddaef9ee-kube-api-access-gcmrb\") pod \"machine-api-operator-5694c8668f-7nrwv\" (UID: \"b5e417ed-5b5e-405b-8b95-ed27ddaef9ee\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-7nrwv" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.632498 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxhpd\" (UniqueName: \"kubernetes.io/projected/352c3d22-5aaa-47c1-a712-9022736511f4-kube-api-access-kxhpd\") pod \"apiserver-7bbb656c7d-zp5qn\" (UID: \"352c3d22-5aaa-47c1-a712-9022736511f4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.632528 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0c1f48e-db09-49cd-82db-e687ea384d2c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-5nrkx\" (UID: \"c0c1f48e-db09-49cd-82db-e687ea384d2c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5nrkx" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.632553 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtmhw\" (UniqueName: \"kubernetes.io/projected/c0c1f48e-db09-49cd-82db-e687ea384d2c-kube-api-access-xtmhw\") pod \"authentication-operator-69f744f599-5nrkx\" (UID: \"c0c1f48e-db09-49cd-82db-e687ea384d2c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5nrkx" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.632574 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98199f15-3582-4b64-a118-c97f9ddadd11-trusted-ca-bundle\") pod \"apiserver-76f77b778f-kzdbd\" (UID: \"98199f15-3582-4b64-a118-c97f9ddadd11\") " pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.632590 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/352c3d22-5aaa-47c1-a712-9022736511f4-encryption-config\") pod \"apiserver-7bbb656c7d-zp5qn\" (UID: \"352c3d22-5aaa-47c1-a712-9022736511f4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.632618 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlrgl\" (UniqueName: \"kubernetes.io/projected/98199f15-3582-4b64-a118-c97f9ddadd11-kube-api-access-mlrgl\") pod \"apiserver-76f77b778f-kzdbd\" (UID: \"98199f15-3582-4b64-a118-c97f9ddadd11\") " pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.632636 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/352c3d22-5aaa-47c1-a712-9022736511f4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zp5qn\" (UID: \"352c3d22-5aaa-47c1-a712-9022736511f4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.632662 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/004c9445-7b3b-4479-bf6e-e6d880e4c7bb-client-ca\") pod \"route-controller-manager-6576b87f9c-rrmzx\" (UID: \"004c9445-7b3b-4479-bf6e-e6d880e4c7bb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrmzx" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.632686 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0c1f48e-db09-49cd-82db-e687ea384d2c-service-ca-bundle\") pod \"authentication-operator-69f744f599-5nrkx\" (UID: \"c0c1f48e-db09-49cd-82db-e687ea384d2c\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-5nrkx" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.632703 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/98199f15-3582-4b64-a118-c97f9ddadd11-audit-dir\") pod \"apiserver-76f77b778f-kzdbd\" (UID: \"98199f15-3582-4b64-a118-c97f9ddadd11\") " pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.632723 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/352c3d22-5aaa-47c1-a712-9022736511f4-audit-dir\") pod \"apiserver-7bbb656c7d-zp5qn\" (UID: \"352c3d22-5aaa-47c1-a712-9022736511f4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.632745 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5e417ed-5b5e-405b-8b95-ed27ddaef9ee-config\") pod \"machine-api-operator-5694c8668f-7nrwv\" (UID: \"b5e417ed-5b5e-405b-8b95-ed27ddaef9ee\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7nrwv" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.632763 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b5e417ed-5b5e-405b-8b95-ed27ddaef9ee-images\") pod \"machine-api-operator-5694c8668f-7nrwv\" (UID: \"b5e417ed-5b5e-405b-8b95-ed27ddaef9ee\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7nrwv" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.632780 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkqd7\" (UniqueName: 
\"kubernetes.io/projected/a5b48e24-98c3-4875-9e6f-9d2de0463048-kube-api-access-fkqd7\") pod \"machine-approver-56656f9798-crfxb\" (UID: \"a5b48e24-98c3-4875-9e6f-9d2de0463048\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crfxb" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.632800 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd925ec4-97eb-4780-82a0-e7dc782542ce-serving-cert\") pod \"console-operator-58897d9998-7qr8w\" (UID: \"bd925ec4-97eb-4780-82a0-e7dc782542ce\") " pod="openshift-console-operator/console-operator-58897d9998-7qr8w" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.632841 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/98199f15-3582-4b64-a118-c97f9ddadd11-etcd-serving-ca\") pod \"apiserver-76f77b778f-kzdbd\" (UID: \"98199f15-3582-4b64-a118-c97f9ddadd11\") " pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.632861 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/352c3d22-5aaa-47c1-a712-9022736511f4-etcd-client\") pod \"apiserver-7bbb656c7d-zp5qn\" (UID: \"352c3d22-5aaa-47c1-a712-9022736511f4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.632885 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e9be6b1c-e3c3-470e-9387-e2a6abd005aa-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-l9c9f\" (UID: \"e9be6b1c-e3c3-470e-9387-e2a6abd005aa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l9c9f" Oct 03 14:45:25 crc 
kubenswrapper[4774]: I1003 14:45:25.632904 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/98199f15-3582-4b64-a118-c97f9ddadd11-image-import-ca\") pod \"apiserver-76f77b778f-kzdbd\" (UID: \"98199f15-3582-4b64-a118-c97f9ddadd11\") " pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.632946 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/004c9445-7b3b-4479-bf6e-e6d880e4c7bb-config\") pod \"route-controller-manager-6576b87f9c-rrmzx\" (UID: \"004c9445-7b3b-4479-bf6e-e6d880e4c7bb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrmzx" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.632963 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vx9v\" (UniqueName: \"kubernetes.io/projected/e9be6b1c-e3c3-470e-9387-e2a6abd005aa-kube-api-access-6vx9v\") pod \"cluster-samples-operator-665b6dd947-l9c9f\" (UID: \"e9be6b1c-e3c3-470e-9387-e2a6abd005aa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l9c9f" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.632983 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98199f15-3582-4b64-a118-c97f9ddadd11-serving-cert\") pod \"apiserver-76f77b778f-kzdbd\" (UID: \"98199f15-3582-4b64-a118-c97f9ddadd11\") " pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.633004 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57phc\" (UniqueName: 
\"kubernetes.io/projected/3b5a60df-c3e9-4553-92d2-0c8d8752d007-kube-api-access-57phc\") pod \"openshift-apiserver-operator-796bbdcf4f-pnvpf\" (UID: \"3b5a60df-c3e9-4553-92d2-0c8d8752d007\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pnvpf" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.633026 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d562068-166d-42b5-9f5f-3e33460f5410-serving-cert\") pod \"openshift-config-operator-7777fb866f-8s9fl\" (UID: \"0d562068-166d-42b5-9f5f-3e33460f5410\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8s9fl" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.633048 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/936c81dd-638d-4776-86fe-54a8aa53e50e-client-ca\") pod \"controller-manager-879f6c89f-vpxwc\" (UID: \"936c81dd-638d-4776-86fe-54a8aa53e50e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vpxwc" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.633064 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0d562068-166d-42b5-9f5f-3e33460f5410-available-featuregates\") pod \"openshift-config-operator-7777fb866f-8s9fl\" (UID: \"0d562068-166d-42b5-9f5f-3e33460f5410\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8s9fl" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.633087 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98199f15-3582-4b64-a118-c97f9ddadd11-config\") pod \"apiserver-76f77b778f-kzdbd\" (UID: \"98199f15-3582-4b64-a118-c97f9ddadd11\") " 
pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.635137 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.639689 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7qxcv"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.652028 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7qxcv" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.652807 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.653077 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.654681 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jsbrr"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.656653 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.656978 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.657092 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.658187 4774 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gwjdm"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.660455 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-s946h"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.660656 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jsbrr" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.660847 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-s946h" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.661044 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gwjdm" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.661945 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s7km8"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.662302 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s7km8" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.662684 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-wjcbg"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.663355 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-wjcbg" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.664048 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vpxwc"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.676396 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.677482 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.677517 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.681175 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7nrwv"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.681609 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-kzdbd"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.683020 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrmzx"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.683786 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.685193 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-hw8fn"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.687439 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 
14:45:25.689234 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.692600 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l9c9f"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.693983 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.694656 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.700161 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-kcqsn"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.700835 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-kcqsn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.707799 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wq2sd"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.708409 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wq2sd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.710585 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-dz4j8"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.711152 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-dz4j8" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.711647 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.714431 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4rcc5"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.714964 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325045-8kfbc"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.715388 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-8kfbc" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.715401 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4rcc5" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.724676 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g6pd2"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.727173 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.732168 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8vg7"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.734155 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98199f15-3582-4b64-a118-c97f9ddadd11-config\") pod \"apiserver-76f77b778f-kzdbd\" (UID: \"98199f15-3582-4b64-a118-c97f9ddadd11\") " 
pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.734199 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6c0571f5-aaf0-4189-a20c-66f5b173ea49-metrics-tls\") pod \"ingress-operator-5b745b69d9-7z59k\" (UID: \"6c0571f5-aaf0-4189-a20c-66f5b173ea49\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7z59k" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.734227 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fd62\" (UniqueName: \"kubernetes.io/projected/6c0571f5-aaf0-4189-a20c-66f5b173ea49-kube-api-access-8fd62\") pod \"ingress-operator-5b745b69d9-7z59k\" (UID: \"6c0571f5-aaf0-4189-a20c-66f5b173ea49\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7z59k" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.734254 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sw2q\" (UniqueName: \"kubernetes.io/projected/40de7d67-3283-492a-bfc8-65c83c19421f-kube-api-access-6sw2q\") pod \"machine-config-controller-84d6567774-pdnsv\" (UID: \"40de7d67-3283-492a-bfc8-65c83c19421f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pdnsv" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.734295 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0c1f48e-db09-49cd-82db-e687ea384d2c-config\") pod \"authentication-operator-69f744f599-5nrkx\" (UID: \"c0c1f48e-db09-49cd-82db-e687ea384d2c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5nrkx" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.734312 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3b5a60df-c3e9-4553-92d2-0c8d8752d007-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-pnvpf\" (UID: \"3b5a60df-c3e9-4553-92d2-0c8d8752d007\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pnvpf" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.734336 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/936c81dd-638d-4776-86fe-54a8aa53e50e-config\") pod \"controller-manager-879f6c89f-vpxwc\" (UID: \"936c81dd-638d-4776-86fe-54a8aa53e50e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vpxwc" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.734362 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-trusted-ca-bundle\") pod \"console-f9d7485db-vnvw7\" (UID: \"9cae35f2-fcf0-4014-9b5b-9887d416e8d3\") " pod="openshift-console/console-f9d7485db-vnvw7" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.734407 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-service-ca\") pod \"console-f9d7485db-vnvw7\" (UID: \"9cae35f2-fcf0-4014-9b5b-9887d416e8d3\") " pod="openshift-console/console-f9d7485db-vnvw7" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.734429 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrq22\" (UniqueName: \"kubernetes.io/projected/5a48eac5-1dc7-4478-ac71-084ad6302324-kube-api-access-xrq22\") pod \"kube-storage-version-migrator-operator-b67b599dd-j7gl6\" (UID: \"5a48eac5-1dc7-4478-ac71-084ad6302324\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j7gl6" Oct 03 
14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.734449 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/40de7d67-3283-492a-bfc8-65c83c19421f-proxy-tls\") pod \"machine-config-controller-84d6567774-pdnsv\" (UID: \"40de7d67-3283-492a-bfc8-65c83c19421f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pdnsv" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.734472 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/004c9445-7b3b-4479-bf6e-e6d880e4c7bb-serving-cert\") pod \"route-controller-manager-6576b87f9c-rrmzx\" (UID: \"004c9445-7b3b-4479-bf6e-e6d880e4c7bb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrmzx" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.734493 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/352c3d22-5aaa-47c1-a712-9022736511f4-audit-policies\") pod \"apiserver-7bbb656c7d-zp5qn\" (UID: \"352c3d22-5aaa-47c1-a712-9022736511f4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.734513 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9621ba12-6602-4afb-80bb-116e84daef13-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pdhbv\" (UID: \"9621ba12-6602-4afb-80bb-116e84daef13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pdhbv" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.734531 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0c1f48e-db09-49cd-82db-e687ea384d2c-serving-cert\") pod 
\"authentication-operator-69f744f599-5nrkx\" (UID: \"c0c1f48e-db09-49cd-82db-e687ea384d2c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5nrkx" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.734549 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/98199f15-3582-4b64-a118-c97f9ddadd11-etcd-client\") pod \"apiserver-76f77b778f-kzdbd\" (UID: \"98199f15-3582-4b64-a118-c97f9ddadd11\") " pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.734641 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/352c3d22-5aaa-47c1-a712-9022736511f4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zp5qn\" (UID: \"352c3d22-5aaa-47c1-a712-9022736511f4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.734665 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-oauth-serving-cert\") pod \"console-f9d7485db-vnvw7\" (UID: \"9cae35f2-fcf0-4014-9b5b-9887d416e8d3\") " pod="openshift-console/console-f9d7485db-vnvw7" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.734683 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/98199f15-3582-4b64-a118-c97f9ddadd11-audit\") pod \"apiserver-76f77b778f-kzdbd\" (UID: \"98199f15-3582-4b64-a118-c97f9ddadd11\") " pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.734704 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsk88\" (UniqueName: 
\"kubernetes.io/projected/40ce0e8a-b3ee-4b5a-a3ef-ee0e4a573496-kube-api-access-fsk88\") pod \"downloads-7954f5f757-jh8hv\" (UID: \"40ce0e8a-b3ee-4b5a-a3ef-ee0e4a573496\") " pod="openshift-console/downloads-7954f5f757-jh8hv" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.734729 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c0571f5-aaf0-4189-a20c-66f5b173ea49-trusted-ca\") pod \"ingress-operator-5b745b69d9-7z59k\" (UID: \"6c0571f5-aaf0-4189-a20c-66f5b173ea49\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7z59k" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.734753 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxs6v\" (UniqueName: \"kubernetes.io/projected/bd925ec4-97eb-4780-82a0-e7dc782542ce-kube-api-access-fxs6v\") pod \"console-operator-58897d9998-7qr8w\" (UID: \"bd925ec4-97eb-4780-82a0-e7dc782542ce\") " pod="openshift-console-operator/console-operator-58897d9998-7qr8w" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.734783 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a5b48e24-98c3-4875-9e6f-9d2de0463048-machine-approver-tls\") pod \"machine-approver-56656f9798-crfxb\" (UID: \"a5b48e24-98c3-4875-9e6f-9d2de0463048\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crfxb" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.734803 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd925ec4-97eb-4780-82a0-e7dc782542ce-trusted-ca\") pod \"console-operator-58897d9998-7qr8w\" (UID: \"bd925ec4-97eb-4780-82a0-e7dc782542ce\") " pod="openshift-console-operator/console-operator-58897d9998-7qr8w" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.734824 4774 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/936c81dd-638d-4776-86fe-54a8aa53e50e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-vpxwc\" (UID: \"936c81dd-638d-4776-86fe-54a8aa53e50e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vpxwc" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.734842 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9621ba12-6602-4afb-80bb-116e84daef13-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pdhbv\" (UID: \"9621ba12-6602-4afb-80bb-116e84daef13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pdhbv" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.734864 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwmnx\" (UniqueName: \"kubernetes.io/projected/004c9445-7b3b-4479-bf6e-e6d880e4c7bb-kube-api-access-zwmnx\") pod \"route-controller-manager-6576b87f9c-rrmzx\" (UID: \"004c9445-7b3b-4479-bf6e-e6d880e4c7bb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrmzx" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.734884 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-972t2\" (UniqueName: \"kubernetes.io/projected/0d562068-166d-42b5-9f5f-3e33460f5410-kube-api-access-972t2\") pod \"openshift-config-operator-7777fb866f-8s9fl\" (UID: \"0d562068-166d-42b5-9f5f-3e33460f5410\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8s9fl" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.734905 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a48eac5-1dc7-4478-ac71-084ad6302324-serving-cert\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-j7gl6\" (UID: \"5a48eac5-1dc7-4478-ac71-084ad6302324\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j7gl6" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.734926 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ffc56e3-143d-4b1d-8ba3-1619951778e7-metrics-tls\") pod \"dns-operator-744455d44c-hw8fn\" (UID: \"2ffc56e3-143d-4b1d-8ba3-1619951778e7\") " pod="openshift-dns-operator/dns-operator-744455d44c-hw8fn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.734956 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/98199f15-3582-4b64-a118-c97f9ddadd11-encryption-config\") pod \"apiserver-76f77b778f-kzdbd\" (UID: \"98199f15-3582-4b64-a118-c97f9ddadd11\") " pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.734977 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b5e417ed-5b5e-405b-8b95-ed27ddaef9ee-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7nrwv\" (UID: \"b5e417ed-5b5e-405b-8b95-ed27ddaef9ee\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7nrwv" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.735126 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwz26\" (UniqueName: \"kubernetes.io/projected/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-kube-api-access-vwz26\") pod \"console-f9d7485db-vnvw7\" (UID: \"9cae35f2-fcf0-4014-9b5b-9887d416e8d3\") " pod="openshift-console/console-f9d7485db-vnvw7" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.735152 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/352c3d22-5aaa-47c1-a712-9022736511f4-serving-cert\") pod \"apiserver-7bbb656c7d-zp5qn\" (UID: \"352c3d22-5aaa-47c1-a712-9022736511f4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.735171 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a5b48e24-98c3-4875-9e6f-9d2de0463048-auth-proxy-config\") pod \"machine-approver-56656f9798-crfxb\" (UID: \"a5b48e24-98c3-4875-9e6f-9d2de0463048\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crfxb" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.735191 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5b48e24-98c3-4875-9e6f-9d2de0463048-config\") pod \"machine-approver-56656f9798-crfxb\" (UID: \"a5b48e24-98c3-4875-9e6f-9d2de0463048\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crfxb" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.735227 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/83f48fb3-47e0-4266-9424-b54b47551fce-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-zh5qn\" (UID: \"83f48fb3-47e0-4266-9424-b54b47551fce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zh5qn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.735259 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/98199f15-3582-4b64-a118-c97f9ddadd11-node-pullsecrets\") pod \"apiserver-76f77b778f-kzdbd\" (UID: \"98199f15-3582-4b64-a118-c97f9ddadd11\") " 
pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.735317 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72b9524f-58b1-451e-91e9-f244f763165d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jmqgl\" (UID: \"72b9524f-58b1-451e-91e9-f244f763165d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jmqgl" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.735340 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmklr\" (UniqueName: \"kubernetes.io/projected/83f48fb3-47e0-4266-9424-b54b47551fce-kube-api-access-rmklr\") pod \"cluster-image-registry-operator-dc59b4c8b-zh5qn\" (UID: \"83f48fb3-47e0-4266-9424-b54b47551fce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zh5qn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.735362 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd925ec4-97eb-4780-82a0-e7dc782542ce-config\") pod \"console-operator-58897d9998-7qr8w\" (UID: \"bd925ec4-97eb-4780-82a0-e7dc782542ce\") " pod="openshift-console-operator/console-operator-58897d9998-7qr8w" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.735396 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcfcz\" (UniqueName: \"kubernetes.io/projected/936c81dd-638d-4776-86fe-54a8aa53e50e-kube-api-access-vcfcz\") pod \"controller-manager-879f6c89f-vpxwc\" (UID: \"936c81dd-638d-4776-86fe-54a8aa53e50e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vpxwc" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.735419 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-8z6f4\" (UniqueName: \"kubernetes.io/projected/72b9524f-58b1-451e-91e9-f244f763165d-kube-api-access-8z6f4\") pod \"openshift-controller-manager-operator-756b6f6bc6-jmqgl\" (UID: \"72b9524f-58b1-451e-91e9-f244f763165d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jmqgl" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.735439 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b266\" (UniqueName: \"kubernetes.io/projected/07e852a6-9dbf-4533-85ad-d64d008bf488-kube-api-access-6b266\") pod \"migrator-59844c95c7-vkxrc\" (UID: \"07e852a6-9dbf-4533-85ad-d64d008bf488\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vkxrc" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.735759 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pnvpf"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.735459 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/83f48fb3-47e0-4266-9424-b54b47551fce-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-zh5qn\" (UID: \"83f48fb3-47e0-4266-9424-b54b47551fce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zh5qn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.735849 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b5a60df-c3e9-4553-92d2-0c8d8752d007-config\") pod \"openshift-apiserver-operator-796bbdcf4f-pnvpf\" (UID: \"3b5a60df-c3e9-4553-92d2-0c8d8752d007\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pnvpf" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.735888 4774 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/936c81dd-638d-4776-86fe-54a8aa53e50e-serving-cert\") pod \"controller-manager-879f6c89f-vpxwc\" (UID: \"936c81dd-638d-4776-86fe-54a8aa53e50e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vpxwc" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.735938 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcmrb\" (UniqueName: \"kubernetes.io/projected/b5e417ed-5b5e-405b-8b95-ed27ddaef9ee-kube-api-access-gcmrb\") pod \"machine-api-operator-5694c8668f-7nrwv\" (UID: \"b5e417ed-5b5e-405b-8b95-ed27ddaef9ee\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7nrwv" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.735981 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxhpd\" (UniqueName: \"kubernetes.io/projected/352c3d22-5aaa-47c1-a712-9022736511f4-kube-api-access-kxhpd\") pod \"apiserver-7bbb656c7d-zp5qn\" (UID: \"352c3d22-5aaa-47c1-a712-9022736511f4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.736779 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6c0571f5-aaf0-4189-a20c-66f5b173ea49-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7z59k\" (UID: \"6c0571f5-aaf0-4189-a20c-66f5b173ea49\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7z59k" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.739951 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98199f15-3582-4b64-a118-c97f9ddadd11-config\") pod \"apiserver-76f77b778f-kzdbd\" (UID: \"98199f15-3582-4b64-a118-c97f9ddadd11\") " pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 
14:45:25.740310 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-console-serving-cert\") pod \"console-f9d7485db-vnvw7\" (UID: \"9cae35f2-fcf0-4014-9b5b-9887d416e8d3\") " pod="openshift-console/console-f9d7485db-vnvw7" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.740551 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd925ec4-97eb-4780-82a0-e7dc782542ce-config\") pod \"console-operator-58897d9998-7qr8w\" (UID: \"bd925ec4-97eb-4780-82a0-e7dc782542ce\") " pod="openshift-console-operator/console-operator-58897d9998-7qr8w" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.740710 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/352c3d22-5aaa-47c1-a712-9022736511f4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zp5qn\" (UID: \"352c3d22-5aaa-47c1-a712-9022736511f4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.740829 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b5a60df-c3e9-4553-92d2-0c8d8752d007-config\") pod \"openshift-apiserver-operator-796bbdcf4f-pnvpf\" (UID: \"3b5a60df-c3e9-4553-92d2-0c8d8752d007\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pnvpf" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.741736 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0c1f48e-db09-49cd-82db-e687ea384d2c-config\") pod \"authentication-operator-69f744f599-5nrkx\" (UID: \"c0c1f48e-db09-49cd-82db-e687ea384d2c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5nrkx" Oct 03 
14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.742113 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/936c81dd-638d-4776-86fe-54a8aa53e50e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-vpxwc\" (UID: \"936c81dd-638d-4776-86fe-54a8aa53e50e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vpxwc" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.742172 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0c1f48e-db09-49cd-82db-e687ea384d2c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-5nrkx\" (UID: \"c0c1f48e-db09-49cd-82db-e687ea384d2c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5nrkx" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.742275 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtmhw\" (UniqueName: \"kubernetes.io/projected/c0c1f48e-db09-49cd-82db-e687ea384d2c-kube-api-access-xtmhw\") pod \"authentication-operator-69f744f599-5nrkx\" (UID: \"c0c1f48e-db09-49cd-82db-e687ea384d2c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5nrkx" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.742305 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd925ec4-97eb-4780-82a0-e7dc782542ce-trusted-ca\") pod \"console-operator-58897d9998-7qr8w\" (UID: \"bd925ec4-97eb-4780-82a0-e7dc782542ce\") " pod="openshift-console-operator/console-operator-58897d9998-7qr8w" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.742363 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-dk4v5"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.742804 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a5b48e24-98c3-4875-9e6f-9d2de0463048-auth-proxy-config\") pod \"machine-approver-56656f9798-crfxb\" (UID: \"a5b48e24-98c3-4875-9e6f-9d2de0463048\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crfxb" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.743025 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0c1f48e-db09-49cd-82db-e687ea384d2c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-5nrkx\" (UID: \"c0c1f48e-db09-49cd-82db-e687ea384d2c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5nrkx" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.743198 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dk4v5" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.743498 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/352c3d22-5aaa-47c1-a712-9022736511f4-audit-policies\") pod \"apiserver-7bbb656c7d-zp5qn\" (UID: \"352c3d22-5aaa-47c1-a712-9022736511f4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.743635 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/98199f15-3582-4b64-a118-c97f9ddadd11-node-pullsecrets\") pod \"apiserver-76f77b778f-kzdbd\" (UID: \"98199f15-3582-4b64-a118-c97f9ddadd11\") " pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.744295 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5b48e24-98c3-4875-9e6f-9d2de0463048-config\") pod \"machine-approver-56656f9798-crfxb\" (UID: 
\"a5b48e24-98c3-4875-9e6f-9d2de0463048\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crfxb" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.737761 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/98199f15-3582-4b64-a118-c97f9ddadd11-audit\") pod \"apiserver-76f77b778f-kzdbd\" (UID: \"98199f15-3582-4b64-a118-c97f9ddadd11\") " pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.744917 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98199f15-3582-4b64-a118-c97f9ddadd11-trusted-ca-bundle\") pod \"apiserver-76f77b778f-kzdbd\" (UID: \"98199f15-3582-4b64-a118-c97f9ddadd11\") " pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.745636 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/98199f15-3582-4b64-a118-c97f9ddadd11-etcd-client\") pod \"apiserver-76f77b778f-kzdbd\" (UID: \"98199f15-3582-4b64-a118-c97f9ddadd11\") " pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.745867 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/352c3d22-5aaa-47c1-a712-9022736511f4-encryption-config\") pod \"apiserver-7bbb656c7d-zp5qn\" (UID: \"352c3d22-5aaa-47c1-a712-9022736511f4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.746049 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a48eac5-1dc7-4478-ac71-084ad6302324-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-j7gl6\" (UID: 
\"5a48eac5-1dc7-4478-ac71-084ad6302324\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j7gl6" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.746551 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/40de7d67-3283-492a-bfc8-65c83c19421f-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-pdnsv\" (UID: \"40de7d67-3283-492a-bfc8-65c83c19421f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pdnsv" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.746728 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlrgl\" (UniqueName: \"kubernetes.io/projected/98199f15-3582-4b64-a118-c97f9ddadd11-kube-api-access-mlrgl\") pod \"apiserver-76f77b778f-kzdbd\" (UID: \"98199f15-3582-4b64-a118-c97f9ddadd11\") " pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.746830 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/352c3d22-5aaa-47c1-a712-9022736511f4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zp5qn\" (UID: \"352c3d22-5aaa-47c1-a712-9022736511f4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.746953 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/83f48fb3-47e0-4266-9424-b54b47551fce-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-zh5qn\" (UID: \"83f48fb3-47e0-4266-9424-b54b47551fce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zh5qn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.747069 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/004c9445-7b3b-4479-bf6e-e6d880e4c7bb-client-ca\") pod \"route-controller-manager-6576b87f9c-rrmzx\" (UID: \"004c9445-7b3b-4479-bf6e-e6d880e4c7bb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrmzx" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.747112 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrq69\" (UniqueName: \"kubernetes.io/projected/2ffc56e3-143d-4b1d-8ba3-1619951778e7-kube-api-access-rrq69\") pod \"dns-operator-744455d44c-hw8fn\" (UID: \"2ffc56e3-143d-4b1d-8ba3-1619951778e7\") " pod="openshift-dns-operator/dns-operator-744455d44c-hw8fn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.747160 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0c1f48e-db09-49cd-82db-e687ea384d2c-service-ca-bundle\") pod \"authentication-operator-69f744f599-5nrkx\" (UID: \"c0c1f48e-db09-49cd-82db-e687ea384d2c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5nrkx" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.747183 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/98199f15-3582-4b64-a118-c97f9ddadd11-audit-dir\") pod \"apiserver-76f77b778f-kzdbd\" (UID: \"98199f15-3582-4b64-a118-c97f9ddadd11\") " pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.747201 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/352c3d22-5aaa-47c1-a712-9022736511f4-audit-dir\") pod \"apiserver-7bbb656c7d-zp5qn\" (UID: \"352c3d22-5aaa-47c1-a712-9022736511f4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn" Oct 03 
14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.747236 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9621ba12-6602-4afb-80bb-116e84daef13-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pdhbv\" (UID: \"9621ba12-6602-4afb-80bb-116e84daef13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pdhbv" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.747262 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5e417ed-5b5e-405b-8b95-ed27ddaef9ee-config\") pod \"machine-api-operator-5694c8668f-7nrwv\" (UID: \"b5e417ed-5b5e-405b-8b95-ed27ddaef9ee\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7nrwv" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.747280 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b5e417ed-5b5e-405b-8b95-ed27ddaef9ee-images\") pod \"machine-api-operator-5694c8668f-7nrwv\" (UID: \"b5e417ed-5b5e-405b-8b95-ed27ddaef9ee\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7nrwv" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.747315 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkqd7\" (UniqueName: \"kubernetes.io/projected/a5b48e24-98c3-4875-9e6f-9d2de0463048-kube-api-access-fkqd7\") pod \"machine-approver-56656f9798-crfxb\" (UID: \"a5b48e24-98c3-4875-9e6f-9d2de0463048\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crfxb" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.747332 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72b9524f-58b1-451e-91e9-f244f763165d-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-jmqgl\" (UID: \"72b9524f-58b1-451e-91e9-f244f763165d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jmqgl" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.747349 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-console-oauth-config\") pod \"console-f9d7485db-vnvw7\" (UID: \"9cae35f2-fcf0-4014-9b5b-9887d416e8d3\") " pod="openshift-console/console-f9d7485db-vnvw7" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.747389 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd925ec4-97eb-4780-82a0-e7dc782542ce-serving-cert\") pod \"console-operator-58897d9998-7qr8w\" (UID: \"bd925ec4-97eb-4780-82a0-e7dc782542ce\") " pod="openshift-console-operator/console-operator-58897d9998-7qr8w" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.747410 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/98199f15-3582-4b64-a118-c97f9ddadd11-etcd-serving-ca\") pod \"apiserver-76f77b778f-kzdbd\" (UID: \"98199f15-3582-4b64-a118-c97f9ddadd11\") " pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.747426 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/352c3d22-5aaa-47c1-a712-9022736511f4-etcd-client\") pod \"apiserver-7bbb656c7d-zp5qn\" (UID: \"352c3d22-5aaa-47c1-a712-9022736511f4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.747444 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"console-config\" (UniqueName: \"kubernetes.io/configmap/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-console-config\") pod \"console-f9d7485db-vnvw7\" (UID: \"9cae35f2-fcf0-4014-9b5b-9887d416e8d3\") " pod="openshift-console/console-f9d7485db-vnvw7" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.747482 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e9be6b1c-e3c3-470e-9387-e2a6abd005aa-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-l9c9f\" (UID: \"e9be6b1c-e3c3-470e-9387-e2a6abd005aa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l9c9f" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.747488 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/352c3d22-5aaa-47c1-a712-9022736511f4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zp5qn\" (UID: \"352c3d22-5aaa-47c1-a712-9022736511f4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.747498 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/98199f15-3582-4b64-a118-c97f9ddadd11-image-import-ca\") pod \"apiserver-76f77b778f-kzdbd\" (UID: \"98199f15-3582-4b64-a118-c97f9ddadd11\") " pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.747560 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/004c9445-7b3b-4479-bf6e-e6d880e4c7bb-config\") pod \"route-controller-manager-6576b87f9c-rrmzx\" (UID: \"004c9445-7b3b-4479-bf6e-e6d880e4c7bb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrmzx" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.747579 4774 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vx9v\" (UniqueName: \"kubernetes.io/projected/e9be6b1c-e3c3-470e-9387-e2a6abd005aa-kube-api-access-6vx9v\") pod \"cluster-samples-operator-665b6dd947-l9c9f\" (UID: \"e9be6b1c-e3c3-470e-9387-e2a6abd005aa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l9c9f" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.747596 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98199f15-3582-4b64-a118-c97f9ddadd11-serving-cert\") pod \"apiserver-76f77b778f-kzdbd\" (UID: \"98199f15-3582-4b64-a118-c97f9ddadd11\") " pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.747629 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57phc\" (UniqueName: \"kubernetes.io/projected/3b5a60df-c3e9-4553-92d2-0c8d8752d007-kube-api-access-57phc\") pod \"openshift-apiserver-operator-796bbdcf4f-pnvpf\" (UID: \"3b5a60df-c3e9-4553-92d2-0c8d8752d007\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pnvpf" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.747673 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d562068-166d-42b5-9f5f-3e33460f5410-serving-cert\") pod \"openshift-config-operator-7777fb866f-8s9fl\" (UID: \"0d562068-166d-42b5-9f5f-3e33460f5410\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8s9fl" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.747694 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/936c81dd-638d-4776-86fe-54a8aa53e50e-client-ca\") pod \"controller-manager-879f6c89f-vpxwc\" (UID: \"936c81dd-638d-4776-86fe-54a8aa53e50e\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-vpxwc" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.747713 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0d562068-166d-42b5-9f5f-3e33460f5410-available-featuregates\") pod \"openshift-config-operator-7777fb866f-8s9fl\" (UID: \"0d562068-166d-42b5-9f5f-3e33460f5410\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8s9fl" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.748069 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a5b48e24-98c3-4875-9e6f-9d2de0463048-machine-approver-tls\") pod \"machine-approver-56656f9798-crfxb\" (UID: \"a5b48e24-98c3-4875-9e6f-9d2de0463048\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crfxb" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.748071 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0d562068-166d-42b5-9f5f-3e33460f5410-available-featuregates\") pod \"openshift-config-operator-7777fb866f-8s9fl\" (UID: \"0d562068-166d-42b5-9f5f-3e33460f5410\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8s9fl" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.748334 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b5e417ed-5b5e-405b-8b95-ed27ddaef9ee-images\") pod \"machine-api-operator-5694c8668f-7nrwv\" (UID: \"b5e417ed-5b5e-405b-8b95-ed27ddaef9ee\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7nrwv" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.748567 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/98199f15-3582-4b64-a118-c97f9ddadd11-encryption-config\") pod \"apiserver-76f77b778f-kzdbd\" (UID: \"98199f15-3582-4b64-a118-c97f9ddadd11\") " pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.748902 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/936c81dd-638d-4776-86fe-54a8aa53e50e-config\") pod \"controller-manager-879f6c89f-vpxwc\" (UID: \"936c81dd-638d-4776-86fe-54a8aa53e50e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vpxwc" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.749259 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98199f15-3582-4b64-a118-c97f9ddadd11-trusted-ca-bundle\") pod \"apiserver-76f77b778f-kzdbd\" (UID: \"98199f15-3582-4b64-a118-c97f9ddadd11\") " pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.749334 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/004c9445-7b3b-4479-bf6e-e6d880e4c7bb-serving-cert\") pod \"route-controller-manager-6576b87f9c-rrmzx\" (UID: \"004c9445-7b3b-4479-bf6e-e6d880e4c7bb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrmzx" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.749408 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/98199f15-3582-4b64-a118-c97f9ddadd11-audit-dir\") pod \"apiserver-76f77b778f-kzdbd\" (UID: \"98199f15-3582-4b64-a118-c97f9ddadd11\") " pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.749503 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/98199f15-3582-4b64-a118-c97f9ddadd11-etcd-serving-ca\") pod \"apiserver-76f77b778f-kzdbd\" (UID: \"98199f15-3582-4b64-a118-c97f9ddadd11\") " pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.749536 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/004c9445-7b3b-4479-bf6e-e6d880e4c7bb-config\") pod \"route-controller-manager-6576b87f9c-rrmzx\" (UID: \"004c9445-7b3b-4479-bf6e-e6d880e4c7bb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrmzx" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.750343 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/004c9445-7b3b-4479-bf6e-e6d880e4c7bb-client-ca\") pod \"route-controller-manager-6576b87f9c-rrmzx\" (UID: \"004c9445-7b3b-4479-bf6e-e6d880e4c7bb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrmzx" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.750635 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0c1f48e-db09-49cd-82db-e687ea384d2c-service-ca-bundle\") pod \"authentication-operator-69f744f599-5nrkx\" (UID: \"c0c1f48e-db09-49cd-82db-e687ea384d2c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5nrkx" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.750682 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/352c3d22-5aaa-47c1-a712-9022736511f4-audit-dir\") pod \"apiserver-7bbb656c7d-zp5qn\" (UID: \"352c3d22-5aaa-47c1-a712-9022736511f4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.750760 4774 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/352c3d22-5aaa-47c1-a712-9022736511f4-serving-cert\") pod \"apiserver-7bbb656c7d-zp5qn\" (UID: \"352c3d22-5aaa-47c1-a712-9022736511f4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.750953 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/98199f15-3582-4b64-a118-c97f9ddadd11-image-import-ca\") pod \"apiserver-76f77b778f-kzdbd\" (UID: \"98199f15-3582-4b64-a118-c97f9ddadd11\") " pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.751142 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/936c81dd-638d-4776-86fe-54a8aa53e50e-client-ca\") pod \"controller-manager-879f6c89f-vpxwc\" (UID: \"936c81dd-638d-4776-86fe-54a8aa53e50e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vpxwc" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.751255 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5e417ed-5b5e-405b-8b95-ed27ddaef9ee-config\") pod \"machine-api-operator-5694c8668f-7nrwv\" (UID: \"b5e417ed-5b5e-405b-8b95-ed27ddaef9ee\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7nrwv" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.751449 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/352c3d22-5aaa-47c1-a712-9022736511f4-encryption-config\") pod \"apiserver-7bbb656c7d-zp5qn\" (UID: \"352c3d22-5aaa-47c1-a712-9022736511f4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.751911 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console-operator/console-operator-58897d9998-7qr8w"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.751931 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/936c81dd-638d-4776-86fe-54a8aa53e50e-serving-cert\") pod \"controller-manager-879f6c89f-vpxwc\" (UID: \"936c81dd-638d-4776-86fe-54a8aa53e50e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vpxwc" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.752065 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98199f15-3582-4b64-a118-c97f9ddadd11-serving-cert\") pod \"apiserver-76f77b778f-kzdbd\" (UID: \"98199f15-3582-4b64-a118-c97f9ddadd11\") " pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.752528 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e9be6b1c-e3c3-470e-9387-e2a6abd005aa-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-l9c9f\" (UID: \"e9be6b1c-e3c3-470e-9387-e2a6abd005aa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l9c9f" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.753450 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b5e417ed-5b5e-405b-8b95-ed27ddaef9ee-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7nrwv\" (UID: \"b5e417ed-5b5e-405b-8b95-ed27ddaef9ee\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7nrwv" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.753628 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0c1f48e-db09-49cd-82db-e687ea384d2c-serving-cert\") pod 
\"authentication-operator-69f744f599-5nrkx\" (UID: \"c0c1f48e-db09-49cd-82db-e687ea384d2c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5nrkx" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.753841 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/352c3d22-5aaa-47c1-a712-9022736511f4-etcd-client\") pod \"apiserver-7bbb656c7d-zp5qn\" (UID: \"352c3d22-5aaa-47c1-a712-9022736511f4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.754564 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b5a60df-c3e9-4553-92d2-0c8d8752d007-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-pnvpf\" (UID: \"3b5a60df-c3e9-4553-92d2-0c8d8752d007\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pnvpf" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.754608 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pdhbv"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.754791 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d562068-166d-42b5-9f5f-3e33460f5410-serving-cert\") pod \"openshift-config-operator-7777fb866f-8s9fl\" (UID: \"0d562068-166d-42b5-9f5f-3e33460f5410\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8s9fl" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.754793 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd925ec4-97eb-4780-82a0-e7dc782542ce-serving-cert\") pod \"console-operator-58897d9998-7qr8w\" (UID: \"bd925ec4-97eb-4780-82a0-e7dc782542ce\") " 
pod="openshift-console-operator/console-operator-58897d9998-7qr8w" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.755734 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jmqgl"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.756924 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zh5qn"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.758510 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-8s9fl"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.764030 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.765316 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwml7"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.766778 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-p27cs"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.768078 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7z59k"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.769399 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-8pqx5"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.770308 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-8pqx5" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.771998 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-pdnsv"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.773086 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-vkxrc"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.775243 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-t8q7s"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.775882 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-t8q7s" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.776743 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gwjdm"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.778215 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-kcqsn"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.779538 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-5nrkx"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.781085 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-vnvw7"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.782820 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jsbrr"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.784082 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7qxcv"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.784443 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.786094 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-gwcm9"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.787415 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4sjxw"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.788735 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j7gl6"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.789597 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-jh8hv"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.790759 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s7km8"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.791827 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-dz4j8"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.793015 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-wjcbg"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.794100 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8pqx5"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.795697 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4rcc5"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 
14:45:25.797299 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dk4v5"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.799670 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-jtjpd"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.801208 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-jtjpd" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.801691 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wq2sd"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.803755 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-jtjpd"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.803985 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.805029 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325045-8kfbc"] Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.824346 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.844274 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.848884 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c0571f5-aaf0-4189-a20c-66f5b173ea49-trusted-ca\") pod \"ingress-operator-5b745b69d9-7z59k\" (UID: 
\"6c0571f5-aaf0-4189-a20c-66f5b173ea49\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7z59k" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.848927 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9621ba12-6602-4afb-80bb-116e84daef13-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pdhbv\" (UID: \"9621ba12-6602-4afb-80bb-116e84daef13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pdhbv" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.848961 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a48eac5-1dc7-4478-ac71-084ad6302324-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-j7gl6\" (UID: \"5a48eac5-1dc7-4478-ac71-084ad6302324\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j7gl6" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.848979 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ffc56e3-143d-4b1d-8ba3-1619951778e7-metrics-tls\") pod \"dns-operator-744455d44c-hw8fn\" (UID: \"2ffc56e3-143d-4b1d-8ba3-1619951778e7\") " pod="openshift-dns-operator/dns-operator-744455d44c-hw8fn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.849004 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwz26\" (UniqueName: \"kubernetes.io/projected/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-kube-api-access-vwz26\") pod \"console-f9d7485db-vnvw7\" (UID: \"9cae35f2-fcf0-4014-9b5b-9887d416e8d3\") " pod="openshift-console/console-f9d7485db-vnvw7" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.849022 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/83f48fb3-47e0-4266-9424-b54b47551fce-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-zh5qn\" (UID: \"83f48fb3-47e0-4266-9424-b54b47551fce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zh5qn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.849041 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72b9524f-58b1-451e-91e9-f244f763165d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jmqgl\" (UID: \"72b9524f-58b1-451e-91e9-f244f763165d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jmqgl" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.849057 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmklr\" (UniqueName: \"kubernetes.io/projected/83f48fb3-47e0-4266-9424-b54b47551fce-kube-api-access-rmklr\") pod \"cluster-image-registry-operator-dc59b4c8b-zh5qn\" (UID: \"83f48fb3-47e0-4266-9424-b54b47551fce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zh5qn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.849081 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z6f4\" (UniqueName: \"kubernetes.io/projected/72b9524f-58b1-451e-91e9-f244f763165d-kube-api-access-8z6f4\") pod \"openshift-controller-manager-operator-756b6f6bc6-jmqgl\" (UID: \"72b9524f-58b1-451e-91e9-f244f763165d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jmqgl" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.849097 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b266\" (UniqueName: \"kubernetes.io/projected/07e852a6-9dbf-4533-85ad-d64d008bf488-kube-api-access-6b266\") pod \"migrator-59844c95c7-vkxrc\" (UID: 
\"07e852a6-9dbf-4533-85ad-d64d008bf488\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vkxrc" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.849112 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/83f48fb3-47e0-4266-9424-b54b47551fce-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-zh5qn\" (UID: \"83f48fb3-47e0-4266-9424-b54b47551fce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zh5qn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.849136 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6c0571f5-aaf0-4189-a20c-66f5b173ea49-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7z59k\" (UID: \"6c0571f5-aaf0-4189-a20c-66f5b173ea49\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7z59k" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.849150 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-console-serving-cert\") pod \"console-f9d7485db-vnvw7\" (UID: \"9cae35f2-fcf0-4014-9b5b-9887d416e8d3\") " pod="openshift-console/console-f9d7485db-vnvw7" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.849183 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a48eac5-1dc7-4478-ac71-084ad6302324-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-j7gl6\" (UID: \"5a48eac5-1dc7-4478-ac71-084ad6302324\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j7gl6" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.849197 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/40de7d67-3283-492a-bfc8-65c83c19421f-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-pdnsv\" (UID: \"40de7d67-3283-492a-bfc8-65c83c19421f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pdnsv" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.849219 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/83f48fb3-47e0-4266-9424-b54b47551fce-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-zh5qn\" (UID: \"83f48fb3-47e0-4266-9424-b54b47551fce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zh5qn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.849235 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrq69\" (UniqueName: \"kubernetes.io/projected/2ffc56e3-143d-4b1d-8ba3-1619951778e7-kube-api-access-rrq69\") pod \"dns-operator-744455d44c-hw8fn\" (UID: \"2ffc56e3-143d-4b1d-8ba3-1619951778e7\") " pod="openshift-dns-operator/dns-operator-744455d44c-hw8fn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.849251 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9621ba12-6602-4afb-80bb-116e84daef13-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pdhbv\" (UID: \"9621ba12-6602-4afb-80bb-116e84daef13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pdhbv" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.849280 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72b9524f-58b1-451e-91e9-f244f763165d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jmqgl\" (UID: \"72b9524f-58b1-451e-91e9-f244f763165d\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jmqgl" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.849296 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-console-oauth-config\") pod \"console-f9d7485db-vnvw7\" (UID: \"9cae35f2-fcf0-4014-9b5b-9887d416e8d3\") " pod="openshift-console/console-f9d7485db-vnvw7" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.849311 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-console-config\") pod \"console-f9d7485db-vnvw7\" (UID: \"9cae35f2-fcf0-4014-9b5b-9887d416e8d3\") " pod="openshift-console/console-f9d7485db-vnvw7" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.849347 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6c0571f5-aaf0-4189-a20c-66f5b173ea49-metrics-tls\") pod \"ingress-operator-5b745b69d9-7z59k\" (UID: \"6c0571f5-aaf0-4189-a20c-66f5b173ea49\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7z59k" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.849362 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fd62\" (UniqueName: \"kubernetes.io/projected/6c0571f5-aaf0-4189-a20c-66f5b173ea49-kube-api-access-8fd62\") pod \"ingress-operator-5b745b69d9-7z59k\" (UID: \"6c0571f5-aaf0-4189-a20c-66f5b173ea49\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7z59k" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.849407 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sw2q\" (UniqueName: \"kubernetes.io/projected/40de7d67-3283-492a-bfc8-65c83c19421f-kube-api-access-6sw2q\") pod 
\"machine-config-controller-84d6567774-pdnsv\" (UID: \"40de7d67-3283-492a-bfc8-65c83c19421f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pdnsv" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.849433 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-service-ca\") pod \"console-f9d7485db-vnvw7\" (UID: \"9cae35f2-fcf0-4014-9b5b-9887d416e8d3\") " pod="openshift-console/console-f9d7485db-vnvw7" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.849451 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-trusted-ca-bundle\") pod \"console-f9d7485db-vnvw7\" (UID: \"9cae35f2-fcf0-4014-9b5b-9887d416e8d3\") " pod="openshift-console/console-f9d7485db-vnvw7" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.849469 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrq22\" (UniqueName: \"kubernetes.io/projected/5a48eac5-1dc7-4478-ac71-084ad6302324-kube-api-access-xrq22\") pod \"kube-storage-version-migrator-operator-b67b599dd-j7gl6\" (UID: \"5a48eac5-1dc7-4478-ac71-084ad6302324\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j7gl6" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.849485 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/40de7d67-3283-492a-bfc8-65c83c19421f-proxy-tls\") pod \"machine-config-controller-84d6567774-pdnsv\" (UID: \"40de7d67-3283-492a-bfc8-65c83c19421f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pdnsv" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.849502 4774 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9621ba12-6602-4afb-80bb-116e84daef13-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pdhbv\" (UID: \"9621ba12-6602-4afb-80bb-116e84daef13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pdhbv" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.849517 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-oauth-serving-cert\") pod \"console-f9d7485db-vnvw7\" (UID: \"9cae35f2-fcf0-4014-9b5b-9887d416e8d3\") " pod="openshift-console/console-f9d7485db-vnvw7" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.850352 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-oauth-serving-cert\") pod \"console-f9d7485db-vnvw7\" (UID: \"9cae35f2-fcf0-4014-9b5b-9887d416e8d3\") " pod="openshift-console/console-f9d7485db-vnvw7" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.850817 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-console-config\") pod \"console-f9d7485db-vnvw7\" (UID: \"9cae35f2-fcf0-4014-9b5b-9887d416e8d3\") " pod="openshift-console/console-f9d7485db-vnvw7" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.850987 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-service-ca\") pod \"console-f9d7485db-vnvw7\" (UID: \"9cae35f2-fcf0-4014-9b5b-9887d416e8d3\") " pod="openshift-console/console-f9d7485db-vnvw7" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.851902 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/40de7d67-3283-492a-bfc8-65c83c19421f-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-pdnsv\" (UID: \"40de7d67-3283-492a-bfc8-65c83c19421f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pdnsv" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.852469 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-trusted-ca-bundle\") pod \"console-f9d7485db-vnvw7\" (UID: \"9cae35f2-fcf0-4014-9b5b-9887d416e8d3\") " pod="openshift-console/console-f9d7485db-vnvw7" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.852741 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a48eac5-1dc7-4478-ac71-084ad6302324-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-j7gl6\" (UID: \"5a48eac5-1dc7-4478-ac71-084ad6302324\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j7gl6" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.852746 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/83f48fb3-47e0-4266-9424-b54b47551fce-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-zh5qn\" (UID: \"83f48fb3-47e0-4266-9424-b54b47551fce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zh5qn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.852759 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-console-serving-cert\") pod \"console-f9d7485db-vnvw7\" (UID: \"9cae35f2-fcf0-4014-9b5b-9887d416e8d3\") " pod="openshift-console/console-f9d7485db-vnvw7" Oct 03 14:45:25 crc 
kubenswrapper[4774]: I1003 14:45:25.853767 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ffc56e3-143d-4b1d-8ba3-1619951778e7-metrics-tls\") pod \"dns-operator-744455d44c-hw8fn\" (UID: \"2ffc56e3-143d-4b1d-8ba3-1619951778e7\") " pod="openshift-dns-operator/dns-operator-744455d44c-hw8fn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.853824 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/83f48fb3-47e0-4266-9424-b54b47551fce-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-zh5qn\" (UID: \"83f48fb3-47e0-4266-9424-b54b47551fce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zh5qn" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.854159 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-console-oauth-config\") pod \"console-f9d7485db-vnvw7\" (UID: \"9cae35f2-fcf0-4014-9b5b-9887d416e8d3\") " pod="openshift-console/console-f9d7485db-vnvw7" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.867091 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.872321 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a48eac5-1dc7-4478-ac71-084ad6302324-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-j7gl6\" (UID: \"5a48eac5-1dc7-4478-ac71-084ad6302324\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j7gl6" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.884216 4774 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.920993 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.924355 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.932110 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c0571f5-aaf0-4189-a20c-66f5b173ea49-trusted-ca\") pod \"ingress-operator-5b745b69d9-7z59k\" (UID: \"6c0571f5-aaf0-4189-a20c-66f5b173ea49\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7z59k" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.944440 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.953195 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6c0571f5-aaf0-4189-a20c-66f5b173ea49-metrics-tls\") pod \"ingress-operator-5b745b69d9-7z59k\" (UID: \"6c0571f5-aaf0-4189-a20c-66f5b173ea49\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7z59k" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.963786 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 03 14:45:25 crc kubenswrapper[4774]: I1003 14:45:25.984405 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.004957 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 03 14:45:26 crc 
kubenswrapper[4774]: I1003 14:45:26.024470 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.043732 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.054028 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9621ba12-6602-4afb-80bb-116e84daef13-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pdhbv\" (UID: \"9621ba12-6602-4afb-80bb-116e84daef13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pdhbv" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.064449 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.069936 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9621ba12-6602-4afb-80bb-116e84daef13-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pdhbv\" (UID: \"9621ba12-6602-4afb-80bb-116e84daef13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pdhbv" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.083987 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.096321 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/40de7d67-3283-492a-bfc8-65c83c19421f-proxy-tls\") pod \"machine-config-controller-84d6567774-pdnsv\" (UID: 
\"40de7d67-3283-492a-bfc8-65c83c19421f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pdnsv" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.104346 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.124211 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.133117 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72b9524f-58b1-451e-91e9-f244f763165d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jmqgl\" (UID: \"72b9524f-58b1-451e-91e9-f244f763165d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jmqgl" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.145347 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.164662 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.184864 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.205102 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.224485 4774 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.244862 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.263841 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.276231 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72b9524f-58b1-451e-91e9-f244f763165d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jmqgl\" (UID: \"72b9524f-58b1-451e-91e9-f244f763165d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jmqgl" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.284697 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.304732 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.325057 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.344418 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.364565 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 
14:45:26.385047 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.425249 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.444038 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.463791 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.485245 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.504420 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.524446 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.544920 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.563901 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.583936 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.604853 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 03 14:45:26 
crc kubenswrapper[4774]: I1003 14:45:26.623851 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.644441 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.662878 4774 request.go:700] Waited for 1.001559744s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmco-proxy-tls&limit=500&resourceVersion=0 Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.664713 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.683666 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.704396 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.724130 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.744262 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.768485 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 03 14:45:26 crc 
kubenswrapper[4774]: I1003 14:45:26.786951 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.803986 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.824195 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.844681 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.865165 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.884744 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.904454 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.924181 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.944546 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.964033 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 03 14:45:26 crc kubenswrapper[4774]: I1003 14:45:26.985068 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" 
Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.006325 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.025288 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.052616 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.064280 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.084898 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.104745 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.124052 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.145247 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.165163 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.184572 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.204953 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 
03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.225830 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.244493 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.263934 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.305044 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsk88\" (UniqueName: \"kubernetes.io/projected/40ce0e8a-b3ee-4b5a-a3ef-ee0e4a573496-kube-api-access-fsk88\") pod \"downloads-7954f5f757-jh8hv\" (UID: \"40ce0e8a-b3ee-4b5a-a3ef-ee0e4a573496\") " pod="openshift-console/downloads-7954f5f757-jh8hv" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.307263 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-jh8hv" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.329614 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxs6v\" (UniqueName: \"kubernetes.io/projected/bd925ec4-97eb-4780-82a0-e7dc782542ce-kube-api-access-fxs6v\") pod \"console-operator-58897d9998-7qr8w\" (UID: \"bd925ec4-97eb-4780-82a0-e7dc782542ce\") " pod="openshift-console-operator/console-operator-58897d9998-7qr8w" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.346640 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxhpd\" (UniqueName: \"kubernetes.io/projected/352c3d22-5aaa-47c1-a712-9022736511f4-kube-api-access-kxhpd\") pod \"apiserver-7bbb656c7d-zp5qn\" (UID: \"352c3d22-5aaa-47c1-a712-9022736511f4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.357299 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwmnx\" (UniqueName: \"kubernetes.io/projected/004c9445-7b3b-4479-bf6e-e6d880e4c7bb-kube-api-access-zwmnx\") pod \"route-controller-manager-6576b87f9c-rrmzx\" (UID: \"004c9445-7b3b-4479-bf6e-e6d880e4c7bb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrmzx" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.376542 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-972t2\" (UniqueName: \"kubernetes.io/projected/0d562068-166d-42b5-9f5f-3e33460f5410-kube-api-access-972t2\") pod \"openshift-config-operator-7777fb866f-8s9fl\" (UID: \"0d562068-166d-42b5-9f5f-3e33460f5410\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8s9fl" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.400765 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcmrb\" (UniqueName: 
\"kubernetes.io/projected/b5e417ed-5b5e-405b-8b95-ed27ddaef9ee-kube-api-access-gcmrb\") pod \"machine-api-operator-5694c8668f-7nrwv\" (UID: \"b5e417ed-5b5e-405b-8b95-ed27ddaef9ee\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7nrwv" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.424998 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcfcz\" (UniqueName: \"kubernetes.io/projected/936c81dd-638d-4776-86fe-54a8aa53e50e-kube-api-access-vcfcz\") pod \"controller-manager-879f6c89f-vpxwc\" (UID: \"936c81dd-638d-4776-86fe-54a8aa53e50e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vpxwc" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.427508 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.441023 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtmhw\" (UniqueName: \"kubernetes.io/projected/c0c1f48e-db09-49cd-82db-e687ea384d2c-kube-api-access-xtmhw\") pod \"authentication-operator-69f744f599-5nrkx\" (UID: \"c0c1f48e-db09-49cd-82db-e687ea384d2c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5nrkx" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.444176 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrmzx" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.444877 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.464319 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.485164 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.498447 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-7qr8w" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.504667 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-jh8hv"] Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.505133 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.524698 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8s9fl" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.542205 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-5nrkx" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.542657 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlrgl\" (UniqueName: \"kubernetes.io/projected/98199f15-3582-4b64-a118-c97f9ddadd11-kube-api-access-mlrgl\") pod \"apiserver-76f77b778f-kzdbd\" (UID: \"98199f15-3582-4b64-a118-c97f9ddadd11\") " pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.583149 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkqd7\" (UniqueName: \"kubernetes.io/projected/a5b48e24-98c3-4875-9e6f-9d2de0463048-kube-api-access-fkqd7\") pod \"machine-approver-56656f9798-crfxb\" (UID: \"a5b48e24-98c3-4875-9e6f-9d2de0463048\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crfxb" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.608824 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vx9v\" (UniqueName: \"kubernetes.io/projected/e9be6b1c-e3c3-470e-9387-e2a6abd005aa-kube-api-access-6vx9v\") pod \"cluster-samples-operator-665b6dd947-l9c9f\" (UID: \"e9be6b1c-e3c3-470e-9387-e2a6abd005aa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l9c9f" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.609335 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vpxwc" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.609461 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn"] Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.623484 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57phc\" (UniqueName: \"kubernetes.io/projected/3b5a60df-c3e9-4553-92d2-0c8d8752d007-kube-api-access-57phc\") pod \"openshift-apiserver-operator-796bbdcf4f-pnvpf\" (UID: \"3b5a60df-c3e9-4553-92d2-0c8d8752d007\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pnvpf" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.623954 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.633726 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrmzx"] Oct 03 14:45:27 crc kubenswrapper[4774]: W1003 14:45:27.646801 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod004c9445_7b3b_4479_bf6e_e6d880e4c7bb.slice/crio-44d59f19cad3c4eb7cedc7ce28146ec22b2487fa1f51a5a5bd41ccbb40920bb5 WatchSource:0}: Error finding container 44d59f19cad3c4eb7cedc7ce28146ec22b2487fa1f51a5a5bd41ccbb40920bb5: Status 404 returned error can't find the container with id 44d59f19cad3c4eb7cedc7ce28146ec22b2487fa1f51a5a5bd41ccbb40920bb5 Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.647042 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.650915 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.664346 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.682852 4774 request.go:700] Waited for 1.906724304s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmachine-config-server-tls&limit=500&resourceVersion=0 Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.686955 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.687854 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7nrwv" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.704180 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.713588 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-8s9fl"] Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.726436 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 03 14:45:27 crc kubenswrapper[4774]: W1003 14:45:27.734821 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d562068_166d_42b5_9f5f_3e33460f5410.slice/crio-6ec9d6b5223806cda6cb0f2418cf8e095b9e122038f49c4134d2a8f1db5ffc5c WatchSource:0}: Error finding container 
6ec9d6b5223806cda6cb0f2418cf8e095b9e122038f49c4134d2a8f1db5ffc5c: Status 404 returned error can't find the container with id 6ec9d6b5223806cda6cb0f2418cf8e095b9e122038f49c4134d2a8f1db5ffc5c Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.745349 4774 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.764183 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-7qr8w"] Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.765902 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.770306 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pnvpf" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.782968 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crfxb" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.784969 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.814909 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vpxwc"] Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.821863 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-5nrkx"] Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.826877 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwz26\" (UniqueName: \"kubernetes.io/projected/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-kube-api-access-vwz26\") pod \"console-f9d7485db-vnvw7\" (UID: \"9cae35f2-fcf0-4014-9b5b-9887d416e8d3\") " pod="openshift-console/console-f9d7485db-vnvw7" Oct 03 14:45:27 crc kubenswrapper[4774]: W1003 14:45:27.833290 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd925ec4_97eb_4780_82a0_e7dc782542ce.slice/crio-b1d18f631b44ec443a2225d9197c629d94f40441c94db0904614f568f4978d4f WatchSource:0}: Error finding container b1d18f631b44ec443a2225d9197c629d94f40441c94db0904614f568f4978d4f: Status 404 returned error can't find the container with id b1d18f631b44ec443a2225d9197c629d94f40441c94db0904614f568f4978d4f Oct 03 14:45:27 crc kubenswrapper[4774]: W1003 14:45:27.835953 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0c1f48e_db09_49cd_82db_e687ea384d2c.slice/crio-d98b0746961aea5f5d1337453b0f50c0ca0710c0982e8d3eb7c722b722919911 WatchSource:0}: Error finding container 
d98b0746961aea5f5d1337453b0f50c0ca0710c0982e8d3eb7c722b722919911: Status 404 returned error can't find the container with id d98b0746961aea5f5d1337453b0f50c0ca0710c0982e8d3eb7c722b722919911 Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.842170 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z6f4\" (UniqueName: \"kubernetes.io/projected/72b9524f-58b1-451e-91e9-f244f763165d-kube-api-access-8z6f4\") pod \"openshift-controller-manager-operator-756b6f6bc6-jmqgl\" (UID: \"72b9524f-58b1-451e-91e9-f244f763165d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jmqgl" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.861527 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b266\" (UniqueName: \"kubernetes.io/projected/07e852a6-9dbf-4533-85ad-d64d008bf488-kube-api-access-6b266\") pod \"migrator-59844c95c7-vkxrc\" (UID: \"07e852a6-9dbf-4533-85ad-d64d008bf488\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vkxrc" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.876700 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/83f48fb3-47e0-4266-9424-b54b47551fce-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-zh5qn\" (UID: \"83f48fb3-47e0-4266-9424-b54b47551fce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zh5qn" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.885200 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l9c9f" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.900077 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sw2q\" (UniqueName: \"kubernetes.io/projected/40de7d67-3283-492a-bfc8-65c83c19421f-kube-api-access-6sw2q\") pod \"machine-config-controller-84d6567774-pdnsv\" (UID: \"40de7d67-3283-492a-bfc8-65c83c19421f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pdnsv" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.902884 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-kzdbd"] Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.915882 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-vnvw7" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.918714 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6c0571f5-aaf0-4189-a20c-66f5b173ea49-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7z59k\" (UID: \"6c0571f5-aaf0-4189-a20c-66f5b173ea49\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7z59k" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.943358 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fd62\" (UniqueName: \"kubernetes.io/projected/6c0571f5-aaf0-4189-a20c-66f5b173ea49-kube-api-access-8fd62\") pod \"ingress-operator-5b745b69d9-7z59k\" (UID: \"6c0571f5-aaf0-4189-a20c-66f5b173ea49\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7z59k" Oct 03 14:45:27 crc kubenswrapper[4774]: W1003 14:45:27.947643 4774 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5b48e24_98c3_4875_9e6f_9d2de0463048.slice/crio-e39a6d22918ca385a9f96b89ba7c15ea9f8a0ccfef056f69f403de97d9ddf857 WatchSource:0}: Error finding container e39a6d22918ca385a9f96b89ba7c15ea9f8a0ccfef056f69f403de97d9ddf857: Status 404 returned error can't find the container with id e39a6d22918ca385a9f96b89ba7c15ea9f8a0ccfef056f69f403de97d9ddf857 Oct 03 14:45:27 crc kubenswrapper[4774]: W1003 14:45:27.948598 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98199f15_3582_4b64_a118_c97f9ddadd11.slice/crio-9f1d9aafe9c09a3e713a4bef33323e1c7ddbb4b5f951ca6454b9c997c0926608 WatchSource:0}: Error finding container 9f1d9aafe9c09a3e713a4bef33323e1c7ddbb4b5f951ca6454b9c997c0926608: Status 404 returned error can't find the container with id 9f1d9aafe9c09a3e713a4bef33323e1c7ddbb4b5f951ca6454b9c997c0926608 Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.958474 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrq22\" (UniqueName: \"kubernetes.io/projected/5a48eac5-1dc7-4478-ac71-084ad6302324-kube-api-access-xrq22\") pod \"kube-storage-version-migrator-operator-b67b599dd-j7gl6\" (UID: \"5a48eac5-1dc7-4478-ac71-084ad6302324\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j7gl6" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.972916 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j7gl6" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.981386 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7z59k" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.986640 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7nrwv"] Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.990861 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmklr\" (UniqueName: \"kubernetes.io/projected/83f48fb3-47e0-4266-9424-b54b47551fce-kube-api-access-rmklr\") pod \"cluster-image-registry-operator-dc59b4c8b-zh5qn\" (UID: \"83f48fb3-47e0-4266-9424-b54b47551fce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zh5qn" Oct 03 14:45:27 crc kubenswrapper[4774]: I1003 14:45:27.997567 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pdnsv" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.001708 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrq69\" (UniqueName: \"kubernetes.io/projected/2ffc56e3-143d-4b1d-8ba3-1619951778e7-kube-api-access-rrq69\") pod \"dns-operator-744455d44c-hw8fn\" (UID: \"2ffc56e3-143d-4b1d-8ba3-1619951778e7\") " pod="openshift-dns-operator/dns-operator-744455d44c-hw8fn" Oct 03 14:45:28 crc kubenswrapper[4774]: W1003 14:45:28.005702 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5e417ed_5b5e_405b_8b95_ed27ddaef9ee.slice/crio-cadc79a31540fb4f2c39f3d49abfa1491f760cb78d26540484cc5981dc744dbf WatchSource:0}: Error finding container cadc79a31540fb4f2c39f3d49abfa1491f760cb78d26540484cc5981dc744dbf: Status 404 returned error can't find the container with id cadc79a31540fb4f2c39f3d49abfa1491f760cb78d26540484cc5981dc744dbf Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.013853 4774 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vpxwc" event={"ID":"936c81dd-638d-4776-86fe-54a8aa53e50e","Type":"ContainerStarted","Data":"8f1f4acc6d1aaba5181a3dcccc9fa5d2ab7f5d710936413b0d4d9db38922dfbb"} Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.014846 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7nrwv" event={"ID":"b5e417ed-5b5e-405b-8b95-ed27ddaef9ee","Type":"ContainerStarted","Data":"cadc79a31540fb4f2c39f3d49abfa1491f760cb78d26540484cc5981dc744dbf"} Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.015696 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jmqgl" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.018866 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8s9fl" event={"ID":"0d562068-166d-42b5-9f5f-3e33460f5410","Type":"ContainerStarted","Data":"6ec9d6b5223806cda6cb0f2418cf8e095b9e122038f49c4134d2a8f1db5ffc5c"} Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.019940 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-jh8hv" event={"ID":"40ce0e8a-b3ee-4b5a-a3ef-ee0e4a573496","Type":"ContainerStarted","Data":"1a68a05d959cc7feeff7698f289866a306b73e28f6bb48fd3a8381653a9cadb2"} Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.019963 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-jh8hv" event={"ID":"40ce0e8a-b3ee-4b5a-a3ef-ee0e4a573496","Type":"ContainerStarted","Data":"a0f4eac5a88ebf72320261aa00485ae98521044c14e4a40822013ae0e6719635"} Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.021242 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" 
event={"ID":"98199f15-3582-4b64-a118-c97f9ddadd11","Type":"ContainerStarted","Data":"9f1d9aafe9c09a3e713a4bef33323e1c7ddbb4b5f951ca6454b9c997c0926608"} Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.021916 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrmzx" event={"ID":"004c9445-7b3b-4479-bf6e-e6d880e4c7bb","Type":"ContainerStarted","Data":"44d59f19cad3c4eb7cedc7ce28146ec22b2487fa1f51a5a5bd41ccbb40920bb5"} Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.022515 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crfxb" event={"ID":"a5b48e24-98c3-4875-9e6f-9d2de0463048","Type":"ContainerStarted","Data":"e39a6d22918ca385a9f96b89ba7c15ea9f8a0ccfef056f69f403de97d9ddf857"} Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.023204 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9621ba12-6602-4afb-80bb-116e84daef13-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pdhbv\" (UID: \"9621ba12-6602-4afb-80bb-116e84daef13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pdhbv" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.023556 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-5nrkx" event={"ID":"c0c1f48e-db09-49cd-82db-e687ea384d2c","Type":"ContainerStarted","Data":"d98b0746961aea5f5d1337453b0f50c0ca0710c0982e8d3eb7c722b722919911"} Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.024518 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn" event={"ID":"352c3d22-5aaa-47c1-a712-9022736511f4","Type":"ContainerStarted","Data":"01b6a9f87696aa4f105817ac62cb58fd08d725d2ac11edb8cc34da3806318eb2"} Oct 03 14:45:28 
crc kubenswrapper[4774]: I1003 14:45:28.025519 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-7qr8w" event={"ID":"bd925ec4-97eb-4780-82a0-e7dc782542ce","Type":"ContainerStarted","Data":"b1d18f631b44ec443a2225d9197c629d94f40441c94db0904614f568f4978d4f"} Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.035419 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vkxrc" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.044526 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pnvpf"] Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.091449 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlqr9\" (UniqueName: \"kubernetes.io/projected/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-kube-api-access-xlqr9\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.091485 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/825e4f26-ffbe-478b-a09b-7ffc436d4b4d-config\") pod \"kube-apiserver-operator-766d6c64bb-t8vg7\" (UID: \"825e4f26-ffbe-478b-a09b-7ffc436d4b4d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8vg7" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.091510 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: 
\"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.091554 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-registry-tls\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.091584 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.091635 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.091659 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8ll4\" (UniqueName: \"kubernetes.io/projected/15557932-56e7-4119-bdec-104aa40ae284-kube-api-access-c8ll4\") pod \"multus-admission-controller-857f4d67dd-gwcm9\" (UID: \"15557932-56e7-4119-bdec-104aa40ae284\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gwcm9" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.091680 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.091701 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.091719 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/825e4f26-ffbe-478b-a09b-7ffc436d4b4d-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-t8vg7\" (UID: \"825e4f26-ffbe-478b-a09b-7ffc436d4b4d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8vg7" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.091739 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-registry-certificates\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.091796 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/63d4244b-7a5e-4b4d-8fc0-52440b50b276-profile-collector-cert\") pod \"olm-operator-6b444d44fb-g6pd2\" (UID: \"63d4244b-7a5e-4b4d-8fc0-52440b50b276\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g6pd2" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.091820 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/48199fa0-dc2a-4df0-b3d9-dca85040f205-profile-collector-cert\") pod \"catalog-operator-68c6474976-jwml7\" (UID: \"48199fa0-dc2a-4df0-b3d9-dca85040f205\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwml7" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.091842 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/825e4f26-ffbe-478b-a09b-7ffc436d4b4d-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-t8vg7\" (UID: \"825e4f26-ffbe-478b-a09b-7ffc436d4b4d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8vg7" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.091888 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/48199fa0-dc2a-4df0-b3d9-dca85040f205-srv-cert\") pod \"catalog-operator-68c6474976-jwml7\" (UID: \"48199fa0-dc2a-4df0-b3d9-dca85040f205\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwml7" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.093060 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/63d4244b-7a5e-4b4d-8fc0-52440b50b276-srv-cert\") pod \"olm-operator-6b444d44fb-g6pd2\" (UID: \"63d4244b-7a5e-4b4d-8fc0-52440b50b276\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g6pd2" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.093108 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwprr\" (UniqueName: \"kubernetes.io/projected/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-kube-api-access-fwprr\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.093212 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.093274 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4rqz\" (UniqueName: \"kubernetes.io/projected/48199fa0-dc2a-4df0-b3d9-dca85040f205-kube-api-access-l4rqz\") pod \"catalog-operator-68c6474976-jwml7\" (UID: \"48199fa0-dc2a-4df0-b3d9-dca85040f205\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwml7" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.093302 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-audit-policies\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.093321 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.093361 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-audit-dir\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.093403 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.093428 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.093445 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95p52\" (UniqueName: \"kubernetes.io/projected/63d4244b-7a5e-4b4d-8fc0-52440b50b276-kube-api-access-95p52\") pod 
\"olm-operator-6b444d44fb-g6pd2\" (UID: \"63d4244b-7a5e-4b4d-8fc0-52440b50b276\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g6pd2" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.098075 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.098259 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-trusted-ca\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.098279 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: E1003 14:45:28.098636 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:28.598619372 +0000 UTC m=+151.187822814 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.098686 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-bound-sa-token\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.098735 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/15557932-56e7-4119-bdec-104aa40ae284-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-gwcm9\" (UID: \"15557932-56e7-4119-bdec-104aa40ae284\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gwcm9" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.098798 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.098816 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.098855 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: W1003 14:45:28.116286 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b5a60df_c3e9_4553_92d2_0c8d8752d007.slice/crio-81e51fffa679690f98c5aec9832eb0569dfe056ed726915c968f273c9a0af75c WatchSource:0}: Error finding container 81e51fffa679690f98c5aec9832eb0569dfe056ed726915c968f273c9a0af75c: Status 404 returned error can't find the container with id 81e51fffa679690f98c5aec9832eb0569dfe056ed726915c968f273c9a0af75c Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.180475 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-vnvw7"] Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.199722 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l9c9f"] Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.199975 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-hw8fn" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.200756 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.200934 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t56nl\" (UniqueName: \"kubernetes.io/projected/5a4b24b0-5a45-41de-a84b-f7e9d4e773f1-kube-api-access-t56nl\") pod \"packageserver-d55dfcdfc-4rcc5\" (UID: \"5a4b24b0-5a45-41de-a84b-f7e9d4e773f1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4rcc5" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.200957 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhpsm\" (UniqueName: \"kubernetes.io/projected/951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9-kube-api-access-jhpsm\") pod \"collect-profiles-29325045-8kfbc\" (UID: \"951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-8kfbc" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.200982 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-bound-sa-token\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201001 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/15557932-56e7-4119-bdec-104aa40ae284-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-gwcm9\" (UID: \"15557932-56e7-4119-bdec-104aa40ae284\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gwcm9" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201065 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpcrp\" (UniqueName: \"kubernetes.io/projected/b35e64bf-83d7-47de-b783-5b602815d90a-kube-api-access-wpcrp\") pod \"csi-hostpathplugin-jtjpd\" (UID: \"b35e64bf-83d7-47de-b783-5b602815d90a\") " pod="hostpath-provisioner/csi-hostpathplugin-jtjpd" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201081 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/087458d5-9967-4dda-8084-028ea5b92cd1-cert\") pod \"ingress-canary-dk4v5\" (UID: \"087458d5-9967-4dda-8084-028ea5b92cd1\") " pod="openshift-ingress-canary/ingress-canary-dk4v5" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201098 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pwqw\" (UniqueName: \"kubernetes.io/projected/087458d5-9967-4dda-8084-028ea5b92cd1-kube-api-access-6pwqw\") pod \"ingress-canary-dk4v5\" (UID: \"087458d5-9967-4dda-8084-028ea5b92cd1\") " pod="openshift-ingress-canary/ingress-canary-dk4v5" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201131 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201151 4774 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/86a66a03-a1cd-4d49-84d5-392cfe63d1ae-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-jsbrr\" (UID: \"86a66a03-a1cd-4d49-84d5-392cfe63d1ae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jsbrr" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201170 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9-config-volume\") pod \"collect-profiles-29325045-8kfbc\" (UID: \"951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-8kfbc" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201201 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201219 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201235 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/cc2761ec-8878-46ce-aea2-234dd179fbe6-images\") pod \"machine-config-operator-74547568cd-gwjdm\" (UID: \"cc2761ec-8878-46ce-aea2-234dd179fbe6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gwjdm" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201254 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67-config\") pod \"etcd-operator-b45778765-wjcbg\" (UID: \"8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wjcbg" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201276 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlqr9\" (UniqueName: \"kubernetes.io/projected/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-kube-api-access-xlqr9\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201325 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/825e4f26-ffbe-478b-a09b-7ffc436d4b4d-config\") pod \"kube-apiserver-operator-766d6c64bb-t8vg7\" (UID: \"825e4f26-ffbe-478b-a09b-7ffc436d4b4d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8vg7" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201345 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbt76\" (UniqueName: \"kubernetes.io/projected/98cf7a02-ed29-4cd4-9f60-77659f186e4b-kube-api-access-fbt76\") pod \"control-plane-machine-set-operator-78cbb6b69f-7qxcv\" (UID: \"98cf7a02-ed29-4cd4-9f60-77659f186e4b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7qxcv" Oct 03 
14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201387 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67-etcd-service-ca\") pod \"etcd-operator-b45778765-wjcbg\" (UID: \"8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wjcbg" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201406 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a4949a78-2546-4a0c-b280-871726da80af-metrics-tls\") pod \"dns-default-8pqx5\" (UID: \"a4949a78-2546-4a0c-b280-871726da80af\") " pod="openshift-dns/dns-default-8pqx5" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201437 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cc2761ec-8878-46ce-aea2-234dd179fbe6-proxy-tls\") pod \"machine-config-operator-74547568cd-gwjdm\" (UID: \"cc2761ec-8878-46ce-aea2-234dd179fbe6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gwjdm" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201469 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201488 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/67a98087-a2b3-457a-be93-8fd1203b825d-default-certificate\") pod 
\"router-default-5444994796-s946h\" (UID: \"67a98087-a2b3-457a-be93-8fd1203b825d\") " pod="openshift-ingress/router-default-5444994796-s946h" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201526 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88303d53-7218-4326-917e-c075e1ec1f12-config\") pod \"service-ca-operator-777779d784-kcqsn\" (UID: \"88303d53-7218-4326-917e-c075e1ec1f12\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kcqsn" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201545 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/923b3b0f-6810-4504-9757-fe4761f6ed37-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wq2sd\" (UID: \"923b3b0f-6810-4504-9757-fe4761f6ed37\") " pod="openshift-marketplace/marketplace-operator-79b997595-wq2sd" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201562 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a4949a78-2546-4a0c-b280-871726da80af-config-volume\") pod \"dns-default-8pqx5\" (UID: \"a4949a78-2546-4a0c-b280-871726da80af\") " pod="openshift-dns/dns-default-8pqx5" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201603 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cddbm\" (UniqueName: \"kubernetes.io/projected/88303d53-7218-4326-917e-c075e1ec1f12-kube-api-access-cddbm\") pod \"service-ca-operator-777779d784-kcqsn\" (UID: \"88303d53-7218-4326-917e-c075e1ec1f12\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kcqsn" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201630 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f27658be-3b8d-4706-9e91-831e832c34a7-config\") pod \"kube-controller-manager-operator-78b949d7b-s7km8\" (UID: \"f27658be-3b8d-4706-9e91-831e832c34a7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s7km8" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201651 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-registry-tls\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201676 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201718 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g49s\" (UniqueName: \"kubernetes.io/projected/e0b599ea-9f7f-483a-a10b-4a5eb0fedce6-kube-api-access-4g49s\") pod \"service-ca-9c57cc56f-dz4j8\" (UID: \"e0b599ea-9f7f-483a-a10b-4a5eb0fedce6\") " pod="openshift-service-ca/service-ca-9c57cc56f-dz4j8" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201772 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: 
\"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201791 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb6qz\" (UniqueName: \"kubernetes.io/projected/cc2761ec-8878-46ce-aea2-234dd179fbe6-kube-api-access-gb6qz\") pod \"machine-config-operator-74547568cd-gwjdm\" (UID: \"cc2761ec-8878-46ce-aea2-234dd179fbe6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gwjdm" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201821 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8ll4\" (UniqueName: \"kubernetes.io/projected/15557932-56e7-4119-bdec-104aa40ae284-kube-api-access-c8ll4\") pod \"multus-admission-controller-857f4d67dd-gwcm9\" (UID: \"15557932-56e7-4119-bdec-104aa40ae284\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gwcm9" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201838 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201859 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/825e4f26-ffbe-478b-a09b-7ffc436d4b4d-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-t8vg7\" (UID: \"825e4f26-ffbe-478b-a09b-7ffc436d4b4d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8vg7" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201878 4774 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/67a98087-a2b3-457a-be93-8fd1203b825d-metrics-certs\") pod \"router-default-5444994796-s946h\" (UID: \"67a98087-a2b3-457a-be93-8fd1203b825d\") " pod="openshift-ingress/router-default-5444994796-s946h" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201908 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201938 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-registry-certificates\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201957 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5a4b24b0-5a45-41de-a84b-f7e9d4e773f1-tmpfs\") pod \"packageserver-d55dfcdfc-4rcc5\" (UID: \"5a4b24b0-5a45-41de-a84b-f7e9d4e773f1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4rcc5" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.201993 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67-etcd-ca\") pod \"etcd-operator-b45778765-wjcbg\" (UID: \"8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-wjcbg" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.202010 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88303d53-7218-4326-917e-c075e1ec1f12-serving-cert\") pod \"service-ca-operator-777779d784-kcqsn\" (UID: \"88303d53-7218-4326-917e-c075e1ec1f12\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kcqsn" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.202041 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e0b599ea-9f7f-483a-a10b-4a5eb0fedce6-signing-key\") pod \"service-ca-9c57cc56f-dz4j8\" (UID: \"e0b599ea-9f7f-483a-a10b-4a5eb0fedce6\") " pod="openshift-service-ca/service-ca-9c57cc56f-dz4j8" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.202059 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e0b599ea-9f7f-483a-a10b-4a5eb0fedce6-signing-cabundle\") pod \"service-ca-9c57cc56f-dz4j8\" (UID: \"e0b599ea-9f7f-483a-a10b-4a5eb0fedce6\") " pod="openshift-service-ca/service-ca-9c57cc56f-dz4j8" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.202075 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f27658be-3b8d-4706-9e91-831e832c34a7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-s7km8\" (UID: \"f27658be-3b8d-4706-9e91-831e832c34a7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s7km8" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.202096 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/923b3b0f-6810-4504-9757-fe4761f6ed37-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wq2sd\" (UID: \"923b3b0f-6810-4504-9757-fe4761f6ed37\") " pod="openshift-marketplace/marketplace-operator-79b997595-wq2sd" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.202115 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b35e64bf-83d7-47de-b783-5b602815d90a-socket-dir\") pod \"csi-hostpathplugin-jtjpd\" (UID: \"b35e64bf-83d7-47de-b783-5b602815d90a\") " pod="hostpath-provisioner/csi-hostpathplugin-jtjpd" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.202133 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b35e64bf-83d7-47de-b783-5b602815d90a-mountpoint-dir\") pod \"csi-hostpathplugin-jtjpd\" (UID: \"b35e64bf-83d7-47de-b783-5b602815d90a\") " pod="hostpath-provisioner/csi-hostpathplugin-jtjpd" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.202177 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5a4b24b0-5a45-41de-a84b-f7e9d4e773f1-webhook-cert\") pod \"packageserver-d55dfcdfc-4rcc5\" (UID: \"5a4b24b0-5a45-41de-a84b-f7e9d4e773f1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4rcc5" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.202198 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/48199fa0-dc2a-4df0-b3d9-dca85040f205-profile-collector-cert\") pod \"catalog-operator-68c6474976-jwml7\" (UID: \"48199fa0-dc2a-4df0-b3d9-dca85040f205\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwml7" Oct 03 14:45:28 
crc kubenswrapper[4774]: I1003 14:45:28.202217 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cc2761ec-8878-46ce-aea2-234dd179fbe6-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gwjdm\" (UID: \"cc2761ec-8878-46ce-aea2-234dd179fbe6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gwjdm" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.202248 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/63d4244b-7a5e-4b4d-8fc0-52440b50b276-profile-collector-cert\") pod \"olm-operator-6b444d44fb-g6pd2\" (UID: \"63d4244b-7a5e-4b4d-8fc0-52440b50b276\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g6pd2" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.202278 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/825e4f26-ffbe-478b-a09b-7ffc436d4b4d-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-t8vg7\" (UID: \"825e4f26-ffbe-478b-a09b-7ffc436d4b4d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8vg7" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.202306 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/48199fa0-dc2a-4df0-b3d9-dca85040f205-srv-cert\") pod \"catalog-operator-68c6474976-jwml7\" (UID: \"48199fa0-dc2a-4df0-b3d9-dca85040f205\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwml7" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.202326 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67-serving-cert\") pod 
\"etcd-operator-b45778765-wjcbg\" (UID: \"8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wjcbg" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.202357 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbhht\" (UniqueName: \"kubernetes.io/projected/a4949a78-2546-4a0c-b280-871726da80af-kube-api-access-kbhht\") pod \"dns-default-8pqx5\" (UID: \"a4949a78-2546-4a0c-b280-871726da80af\") " pod="openshift-dns/dns-default-8pqx5" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.204413 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5a4b24b0-5a45-41de-a84b-f7e9d4e773f1-apiservice-cert\") pod \"packageserver-d55dfcdfc-4rcc5\" (UID: \"5a4b24b0-5a45-41de-a84b-f7e9d4e773f1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4rcc5" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.204668 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qctl\" (UniqueName: \"kubernetes.io/projected/67a98087-a2b3-457a-be93-8fd1203b825d-kube-api-access-9qctl\") pod \"router-default-5444994796-s946h\" (UID: \"67a98087-a2b3-457a-be93-8fd1203b825d\") " pod="openshift-ingress/router-default-5444994796-s946h" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.204693 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/63d4244b-7a5e-4b4d-8fc0-52440b50b276-srv-cert\") pod \"olm-operator-6b444d44fb-g6pd2\" (UID: \"63d4244b-7a5e-4b4d-8fc0-52440b50b276\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g6pd2" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.204714 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-8cktv\" (UniqueName: \"kubernetes.io/projected/8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67-kube-api-access-8cktv\") pod \"etcd-operator-b45778765-wjcbg\" (UID: \"8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wjcbg" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.204760 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf8zs\" (UniqueName: \"kubernetes.io/projected/c27f3211-f4ba-485a-9bf7-edf913003803-kube-api-access-pf8zs\") pod \"machine-config-server-t8q7s\" (UID: \"c27f3211-f4ba-485a-9bf7-edf913003803\") " pod="openshift-machine-config-operator/machine-config-server-t8q7s" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.204779 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/67a98087-a2b3-457a-be93-8fd1203b825d-stats-auth\") pod \"router-default-5444994796-s946h\" (UID: \"67a98087-a2b3-457a-be93-8fd1203b825d\") " pod="openshift-ingress/router-default-5444994796-s946h" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.204817 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67-etcd-client\") pod \"etcd-operator-b45778765-wjcbg\" (UID: \"8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wjcbg" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.204841 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwprr\" (UniqueName: \"kubernetes.io/projected/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-kube-api-access-fwprr\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc 
kubenswrapper[4774]: I1003 14:45:28.204937 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.204958 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f27658be-3b8d-4706-9e91-831e832c34a7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-s7km8\" (UID: \"f27658be-3b8d-4706-9e91-831e832c34a7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s7km8" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.204993 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b35e64bf-83d7-47de-b783-5b602815d90a-plugins-dir\") pod \"csi-hostpathplugin-jtjpd\" (UID: \"b35e64bf-83d7-47de-b783-5b602815d90a\") " pod="hostpath-provisioner/csi-hostpathplugin-jtjpd" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.205018 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4rqz\" (UniqueName: \"kubernetes.io/projected/48199fa0-dc2a-4df0-b3d9-dca85040f205-kube-api-access-l4rqz\") pod \"catalog-operator-68c6474976-jwml7\" (UID: \"48199fa0-dc2a-4df0-b3d9-dca85040f205\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwml7" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.205066 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/b35e64bf-83d7-47de-b783-5b602815d90a-registration-dir\") pod \"csi-hostpathplugin-jtjpd\" (UID: \"b35e64bf-83d7-47de-b783-5b602815d90a\") " pod="hostpath-provisioner/csi-hostpathplugin-jtjpd" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.205084 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-audit-policies\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.205105 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c27f3211-f4ba-485a-9bf7-edf913003803-node-bootstrap-token\") pod \"machine-config-server-t8q7s\" (UID: \"c27f3211-f4ba-485a-9bf7-edf913003803\") " pod="openshift-machine-config-operator/machine-config-server-t8q7s" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.205126 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7rtc\" (UniqueName: \"kubernetes.io/projected/923b3b0f-6810-4504-9757-fe4761f6ed37-kube-api-access-d7rtc\") pod \"marketplace-operator-79b997595-wq2sd\" (UID: \"923b3b0f-6810-4504-9757-fe4761f6ed37\") " pod="openshift-marketplace/marketplace-operator-79b997595-wq2sd" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.205172 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: 
I1003 14:45:28.205192 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67a98087-a2b3-457a-be93-8fd1203b825d-service-ca-bundle\") pod \"router-default-5444994796-s946h\" (UID: \"67a98087-a2b3-457a-be93-8fd1203b825d\") " pod="openshift-ingress/router-default-5444994796-s946h" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.205211 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-audit-dir\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.205257 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.205278 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd4x9\" (UniqueName: \"kubernetes.io/projected/86a66a03-a1cd-4d49-84d5-392cfe63d1ae-kube-api-access-fd4x9\") pod \"package-server-manager-789f6589d5-jsbrr\" (UID: \"86a66a03-a1cd-4d49-84d5-392cfe63d1ae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jsbrr" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.205312 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95p52\" (UniqueName: \"kubernetes.io/projected/63d4244b-7a5e-4b4d-8fc0-52440b50b276-kube-api-access-95p52\") pod \"olm-operator-6b444d44fb-g6pd2\" (UID: 
\"63d4244b-7a5e-4b4d-8fc0-52440b50b276\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g6pd2" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.205360 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.205399 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9-secret-volume\") pod \"collect-profiles-29325045-8kfbc\" (UID: \"951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-8kfbc" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.205422 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b35e64bf-83d7-47de-b783-5b602815d90a-csi-data-dir\") pod \"csi-hostpathplugin-jtjpd\" (UID: \"b35e64bf-83d7-47de-b783-5b602815d90a\") " pod="hostpath-provisioner/csi-hostpathplugin-jtjpd" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.205451 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c27f3211-f4ba-485a-9bf7-edf913003803-certs\") pod \"machine-config-server-t8q7s\" (UID: \"c27f3211-f4ba-485a-9bf7-edf913003803\") " pod="openshift-machine-config-operator/machine-config-server-t8q7s" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.205472 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/98cf7a02-ed29-4cd4-9f60-77659f186e4b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-7qxcv\" (UID: \"98cf7a02-ed29-4cd4-9f60-77659f186e4b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7qxcv" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.205545 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-trusted-ca\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.205567 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: E1003 14:45:28.205878 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:28.705858936 +0000 UTC m=+151.295062388 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.207657 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.213080 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.214653 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-audit-dir\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.214839 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-p27cs\" 
(UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.215230 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/825e4f26-ffbe-478b-a09b-7ffc436d4b4d-config\") pod \"kube-apiserver-operator-766d6c64bb-t8vg7\" (UID: \"825e4f26-ffbe-478b-a09b-7ffc436d4b4d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8vg7" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.215618 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-registry-certificates\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.216981 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-audit-policies\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.224533 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.225166 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-trusted-ca\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.226160 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/63d4244b-7a5e-4b4d-8fc0-52440b50b276-profile-collector-cert\") pod \"olm-operator-6b444d44fb-g6pd2\" (UID: \"63d4244b-7a5e-4b4d-8fc0-52440b50b276\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g6pd2" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.226195 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/48199fa0-dc2a-4df0-b3d9-dca85040f205-profile-collector-cert\") pod \"catalog-operator-68c6474976-jwml7\" (UID: \"48199fa0-dc2a-4df0-b3d9-dca85040f205\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwml7" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.226298 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.226313 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/15557932-56e7-4119-bdec-104aa40ae284-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-gwcm9\" (UID: \"15557932-56e7-4119-bdec-104aa40ae284\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gwcm9" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 
14:45:28.226523 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.226606 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zh5qn" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.226606 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-bound-sa-token\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.227032 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-registry-tls\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.227038 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.227133 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.227145 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.227150 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.227454 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/48199fa0-dc2a-4df0-b3d9-dca85040f205-srv-cert\") pod \"catalog-operator-68c6474976-jwml7\" (UID: \"48199fa0-dc2a-4df0-b3d9-dca85040f205\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwml7" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.227685 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 
03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.228473 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/63d4244b-7a5e-4b4d-8fc0-52440b50b276-srv-cert\") pod \"olm-operator-6b444d44fb-g6pd2\" (UID: \"63d4244b-7a5e-4b4d-8fc0-52440b50b276\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g6pd2" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.229736 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/825e4f26-ffbe-478b-a09b-7ffc436d4b4d-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-t8vg7\" (UID: \"825e4f26-ffbe-478b-a09b-7ffc436d4b4d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8vg7" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.263249 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95p52\" (UniqueName: \"kubernetes.io/projected/63d4244b-7a5e-4b4d-8fc0-52440b50b276-kube-api-access-95p52\") pod \"olm-operator-6b444d44fb-g6pd2\" (UID: \"63d4244b-7a5e-4b4d-8fc0-52440b50b276\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g6pd2" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.271171 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-pdnsv"] Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.281647 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.282701 4774 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-l4rqz\" (UniqueName: \"kubernetes.io/projected/48199fa0-dc2a-4df0-b3d9-dca85040f205-kube-api-access-l4rqz\") pod \"catalog-operator-68c6474976-jwml7\" (UID: \"48199fa0-dc2a-4df0-b3d9-dca85040f205\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwml7" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.283885 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.289795 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pdhbv" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.301052 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwprr\" (UniqueName: \"kubernetes.io/projected/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-kube-api-access-fwprr\") pod \"oauth-openshift-558db77b4-p27cs\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.302152 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j7gl6"] Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.306093 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g6pd2" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.307139 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b35e64bf-83d7-47de-b783-5b602815d90a-plugins-dir\") pod \"csi-hostpathplugin-jtjpd\" (UID: \"b35e64bf-83d7-47de-b783-5b602815d90a\") " pod="hostpath-provisioner/csi-hostpathplugin-jtjpd" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.307187 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b35e64bf-83d7-47de-b783-5b602815d90a-registration-dir\") pod \"csi-hostpathplugin-jtjpd\" (UID: \"b35e64bf-83d7-47de-b783-5b602815d90a\") " pod="hostpath-provisioner/csi-hostpathplugin-jtjpd" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.307223 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c27f3211-f4ba-485a-9bf7-edf913003803-node-bootstrap-token\") pod \"machine-config-server-t8q7s\" (UID: \"c27f3211-f4ba-485a-9bf7-edf913003803\") " pod="openshift-machine-config-operator/machine-config-server-t8q7s" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.307249 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67a98087-a2b3-457a-be93-8fd1203b825d-service-ca-bundle\") pod \"router-default-5444994796-s946h\" (UID: \"67a98087-a2b3-457a-be93-8fd1203b825d\") " pod="openshift-ingress/router-default-5444994796-s946h" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.307272 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7rtc\" (UniqueName: \"kubernetes.io/projected/923b3b0f-6810-4504-9757-fe4761f6ed37-kube-api-access-d7rtc\") pod 
\"marketplace-operator-79b997595-wq2sd\" (UID: \"923b3b0f-6810-4504-9757-fe4761f6ed37\") " pod="openshift-marketplace/marketplace-operator-79b997595-wq2sd" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.307307 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd4x9\" (UniqueName: \"kubernetes.io/projected/86a66a03-a1cd-4d49-84d5-392cfe63d1ae-kube-api-access-fd4x9\") pod \"package-server-manager-789f6589d5-jsbrr\" (UID: \"86a66a03-a1cd-4d49-84d5-392cfe63d1ae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jsbrr" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.307334 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.307358 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9-secret-volume\") pod \"collect-profiles-29325045-8kfbc\" (UID: \"951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-8kfbc" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.307401 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b35e64bf-83d7-47de-b783-5b602815d90a-csi-data-dir\") pod \"csi-hostpathplugin-jtjpd\" (UID: \"b35e64bf-83d7-47de-b783-5b602815d90a\") " pod="hostpath-provisioner/csi-hostpathplugin-jtjpd" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.307428 4774 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/98cf7a02-ed29-4cd4-9f60-77659f186e4b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-7qxcv\" (UID: \"98cf7a02-ed29-4cd4-9f60-77659f186e4b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7qxcv" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.307451 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c27f3211-f4ba-485a-9bf7-edf913003803-certs\") pod \"machine-config-server-t8q7s\" (UID: \"c27f3211-f4ba-485a-9bf7-edf913003803\") " pod="openshift-machine-config-operator/machine-config-server-t8q7s" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.307476 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t56nl\" (UniqueName: \"kubernetes.io/projected/5a4b24b0-5a45-41de-a84b-f7e9d4e773f1-kube-api-access-t56nl\") pod \"packageserver-d55dfcdfc-4rcc5\" (UID: \"5a4b24b0-5a45-41de-a84b-f7e9d4e773f1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4rcc5" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.307499 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhpsm\" (UniqueName: \"kubernetes.io/projected/951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9-kube-api-access-jhpsm\") pod \"collect-profiles-29325045-8kfbc\" (UID: \"951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-8kfbc" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.307525 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpcrp\" (UniqueName: \"kubernetes.io/projected/b35e64bf-83d7-47de-b783-5b602815d90a-kube-api-access-wpcrp\") pod \"csi-hostpathplugin-jtjpd\" (UID: \"b35e64bf-83d7-47de-b783-5b602815d90a\") " 
pod="hostpath-provisioner/csi-hostpathplugin-jtjpd" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.307546 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/087458d5-9967-4dda-8084-028ea5b92cd1-cert\") pod \"ingress-canary-dk4v5\" (UID: \"087458d5-9967-4dda-8084-028ea5b92cd1\") " pod="openshift-ingress-canary/ingress-canary-dk4v5" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.307565 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pwqw\" (UniqueName: \"kubernetes.io/projected/087458d5-9967-4dda-8084-028ea5b92cd1-kube-api-access-6pwqw\") pod \"ingress-canary-dk4v5\" (UID: \"087458d5-9967-4dda-8084-028ea5b92cd1\") " pod="openshift-ingress-canary/ingress-canary-dk4v5" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.307585 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cc2761ec-8878-46ce-aea2-234dd179fbe6-images\") pod \"machine-config-operator-74547568cd-gwjdm\" (UID: \"cc2761ec-8878-46ce-aea2-234dd179fbe6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gwjdm" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.307608 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/86a66a03-a1cd-4d49-84d5-392cfe63d1ae-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-jsbrr\" (UID: \"86a66a03-a1cd-4d49-84d5-392cfe63d1ae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jsbrr" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.307628 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9-config-volume\") pod 
\"collect-profiles-29325045-8kfbc\" (UID: \"951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-8kfbc" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.307648 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67-config\") pod \"etcd-operator-b45778765-wjcbg\" (UID: \"8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wjcbg" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.307678 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbt76\" (UniqueName: \"kubernetes.io/projected/98cf7a02-ed29-4cd4-9f60-77659f186e4b-kube-api-access-fbt76\") pod \"control-plane-machine-set-operator-78cbb6b69f-7qxcv\" (UID: \"98cf7a02-ed29-4cd4-9f60-77659f186e4b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7qxcv" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.307702 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67-etcd-service-ca\") pod \"etcd-operator-b45778765-wjcbg\" (UID: \"8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wjcbg" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.307724 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a4949a78-2546-4a0c-b280-871726da80af-metrics-tls\") pod \"dns-default-8pqx5\" (UID: \"a4949a78-2546-4a0c-b280-871726da80af\") " pod="openshift-dns/dns-default-8pqx5" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.307747 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/67a98087-a2b3-457a-be93-8fd1203b825d-default-certificate\") pod \"router-default-5444994796-s946h\" (UID: \"67a98087-a2b3-457a-be93-8fd1203b825d\") " pod="openshift-ingress/router-default-5444994796-s946h" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.307766 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cc2761ec-8878-46ce-aea2-234dd179fbe6-proxy-tls\") pod \"machine-config-operator-74547568cd-gwjdm\" (UID: \"cc2761ec-8878-46ce-aea2-234dd179fbe6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gwjdm" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.307802 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88303d53-7218-4326-917e-c075e1ec1f12-config\") pod \"service-ca-operator-777779d784-kcqsn\" (UID: \"88303d53-7218-4326-917e-c075e1ec1f12\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kcqsn" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.307823 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/923b3b0f-6810-4504-9757-fe4761f6ed37-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wq2sd\" (UID: \"923b3b0f-6810-4504-9757-fe4761f6ed37\") " pod="openshift-marketplace/marketplace-operator-79b997595-wq2sd" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.307843 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a4949a78-2546-4a0c-b280-871726da80af-config-volume\") pod \"dns-default-8pqx5\" (UID: \"a4949a78-2546-4a0c-b280-871726da80af\") " pod="openshift-dns/dns-default-8pqx5" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.307864 4774 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-cddbm\" (UniqueName: \"kubernetes.io/projected/88303d53-7218-4326-917e-c075e1ec1f12-kube-api-access-cddbm\") pod \"service-ca-operator-777779d784-kcqsn\" (UID: \"88303d53-7218-4326-917e-c075e1ec1f12\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kcqsn" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.307887 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f27658be-3b8d-4706-9e91-831e832c34a7-config\") pod \"kube-controller-manager-operator-78b949d7b-s7km8\" (UID: \"f27658be-3b8d-4706-9e91-831e832c34a7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s7km8" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.307912 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g49s\" (UniqueName: \"kubernetes.io/projected/e0b599ea-9f7f-483a-a10b-4a5eb0fedce6-kube-api-access-4g49s\") pod \"service-ca-9c57cc56f-dz4j8\" (UID: \"e0b599ea-9f7f-483a-a10b-4a5eb0fedce6\") " pod="openshift-service-ca/service-ca-9c57cc56f-dz4j8" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.307952 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb6qz\" (UniqueName: \"kubernetes.io/projected/cc2761ec-8878-46ce-aea2-234dd179fbe6-kube-api-access-gb6qz\") pod \"machine-config-operator-74547568cd-gwjdm\" (UID: \"cc2761ec-8878-46ce-aea2-234dd179fbe6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gwjdm" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.307975 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/67a98087-a2b3-457a-be93-8fd1203b825d-metrics-certs\") pod \"router-default-5444994796-s946h\" (UID: \"67a98087-a2b3-457a-be93-8fd1203b825d\") " 
pod="openshift-ingress/router-default-5444994796-s946h" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.307999 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5a4b24b0-5a45-41de-a84b-f7e9d4e773f1-tmpfs\") pod \"packageserver-d55dfcdfc-4rcc5\" (UID: \"5a4b24b0-5a45-41de-a84b-f7e9d4e773f1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4rcc5" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.308020 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88303d53-7218-4326-917e-c075e1ec1f12-serving-cert\") pod \"service-ca-operator-777779d784-kcqsn\" (UID: \"88303d53-7218-4326-917e-c075e1ec1f12\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kcqsn" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.308040 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67-etcd-ca\") pod \"etcd-operator-b45778765-wjcbg\" (UID: \"8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wjcbg" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.308060 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e0b599ea-9f7f-483a-a10b-4a5eb0fedce6-signing-key\") pod \"service-ca-9c57cc56f-dz4j8\" (UID: \"e0b599ea-9f7f-483a-a10b-4a5eb0fedce6\") " pod="openshift-service-ca/service-ca-9c57cc56f-dz4j8" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.308080 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e0b599ea-9f7f-483a-a10b-4a5eb0fedce6-signing-cabundle\") pod \"service-ca-9c57cc56f-dz4j8\" (UID: \"e0b599ea-9f7f-483a-a10b-4a5eb0fedce6\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-dz4j8" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.308100 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f27658be-3b8d-4706-9e91-831e832c34a7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-s7km8\" (UID: \"f27658be-3b8d-4706-9e91-831e832c34a7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s7km8" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.308121 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/923b3b0f-6810-4504-9757-fe4761f6ed37-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wq2sd\" (UID: \"923b3b0f-6810-4504-9757-fe4761f6ed37\") " pod="openshift-marketplace/marketplace-operator-79b997595-wq2sd" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.308142 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5a4b24b0-5a45-41de-a84b-f7e9d4e773f1-webhook-cert\") pod \"packageserver-d55dfcdfc-4rcc5\" (UID: \"5a4b24b0-5a45-41de-a84b-f7e9d4e773f1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4rcc5" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.308167 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b35e64bf-83d7-47de-b783-5b602815d90a-socket-dir\") pod \"csi-hostpathplugin-jtjpd\" (UID: \"b35e64bf-83d7-47de-b783-5b602815d90a\") " pod="hostpath-provisioner/csi-hostpathplugin-jtjpd" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.308189 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: 
\"kubernetes.io/host-path/b35e64bf-83d7-47de-b783-5b602815d90a-mountpoint-dir\") pod \"csi-hostpathplugin-jtjpd\" (UID: \"b35e64bf-83d7-47de-b783-5b602815d90a\") " pod="hostpath-provisioner/csi-hostpathplugin-jtjpd" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.308220 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cc2761ec-8878-46ce-aea2-234dd179fbe6-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gwjdm\" (UID: \"cc2761ec-8878-46ce-aea2-234dd179fbe6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gwjdm" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.308255 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67-serving-cert\") pod \"etcd-operator-b45778765-wjcbg\" (UID: \"8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wjcbg" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.308279 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qctl\" (UniqueName: \"kubernetes.io/projected/67a98087-a2b3-457a-be93-8fd1203b825d-kube-api-access-9qctl\") pod \"router-default-5444994796-s946h\" (UID: \"67a98087-a2b3-457a-be93-8fd1203b825d\") " pod="openshift-ingress/router-default-5444994796-s946h" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.308301 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbhht\" (UniqueName: \"kubernetes.io/projected/a4949a78-2546-4a0c-b280-871726da80af-kube-api-access-kbhht\") pod \"dns-default-8pqx5\" (UID: \"a4949a78-2546-4a0c-b280-871726da80af\") " pod="openshift-dns/dns-default-8pqx5" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.308321 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5a4b24b0-5a45-41de-a84b-f7e9d4e773f1-apiservice-cert\") pod \"packageserver-d55dfcdfc-4rcc5\" (UID: \"5a4b24b0-5a45-41de-a84b-f7e9d4e773f1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4rcc5" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.308361 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cktv\" (UniqueName: \"kubernetes.io/projected/8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67-kube-api-access-8cktv\") pod \"etcd-operator-b45778765-wjcbg\" (UID: \"8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wjcbg" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.308415 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf8zs\" (UniqueName: \"kubernetes.io/projected/c27f3211-f4ba-485a-9bf7-edf913003803-kube-api-access-pf8zs\") pod \"machine-config-server-t8q7s\" (UID: \"c27f3211-f4ba-485a-9bf7-edf913003803\") " pod="openshift-machine-config-operator/machine-config-server-t8q7s" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.308444 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/67a98087-a2b3-457a-be93-8fd1203b825d-stats-auth\") pod \"router-default-5444994796-s946h\" (UID: \"67a98087-a2b3-457a-be93-8fd1203b825d\") " pod="openshift-ingress/router-default-5444994796-s946h" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.308517 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67-etcd-client\") pod \"etcd-operator-b45778765-wjcbg\" (UID: \"8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wjcbg" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.308564 4774 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f27658be-3b8d-4706-9e91-831e832c34a7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-s7km8\" (UID: \"f27658be-3b8d-4706-9e91-831e832c34a7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s7km8" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.308923 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b35e64bf-83d7-47de-b783-5b602815d90a-plugins-dir\") pod \"csi-hostpathplugin-jtjpd\" (UID: \"b35e64bf-83d7-47de-b783-5b602815d90a\") " pod="hostpath-provisioner/csi-hostpathplugin-jtjpd" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.309017 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b35e64bf-83d7-47de-b783-5b602815d90a-registration-dir\") pod \"csi-hostpathplugin-jtjpd\" (UID: \"b35e64bf-83d7-47de-b783-5b602815d90a\") " pod="hostpath-provisioner/csi-hostpathplugin-jtjpd" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.311285 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e0b599ea-9f7f-483a-a10b-4a5eb0fedce6-signing-key\") pod \"service-ca-9c57cc56f-dz4j8\" (UID: \"e0b599ea-9f7f-483a-a10b-4a5eb0fedce6\") " pod="openshift-service-ca/service-ca-9c57cc56f-dz4j8" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.311933 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f27658be-3b8d-4706-9e91-831e832c34a7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-s7km8\" (UID: \"f27658be-3b8d-4706-9e91-831e832c34a7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s7km8" Oct 03 14:45:28 crc 
kubenswrapper[4774]: I1003 14:45:28.312695 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88303d53-7218-4326-917e-c075e1ec1f12-config\") pod \"service-ca-operator-777779d784-kcqsn\" (UID: \"88303d53-7218-4326-917e-c075e1ec1f12\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kcqsn" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.312874 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e0b599ea-9f7f-483a-a10b-4a5eb0fedce6-signing-cabundle\") pod \"service-ca-9c57cc56f-dz4j8\" (UID: \"e0b599ea-9f7f-483a-a10b-4a5eb0fedce6\") " pod="openshift-service-ca/service-ca-9c57cc56f-dz4j8" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.313594 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/67a98087-a2b3-457a-be93-8fd1203b825d-default-certificate\") pod \"router-default-5444994796-s946h\" (UID: \"67a98087-a2b3-457a-be93-8fd1203b825d\") " pod="openshift-ingress/router-default-5444994796-s946h" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.313710 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67a98087-a2b3-457a-be93-8fd1203b825d-service-ca-bundle\") pod \"router-default-5444994796-s946h\" (UID: \"67a98087-a2b3-457a-be93-8fd1203b825d\") " pod="openshift-ingress/router-default-5444994796-s946h" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.313849 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b35e64bf-83d7-47de-b783-5b602815d90a-csi-data-dir\") pod \"csi-hostpathplugin-jtjpd\" (UID: \"b35e64bf-83d7-47de-b783-5b602815d90a\") " pod="hostpath-provisioner/csi-hostpathplugin-jtjpd" Oct 03 14:45:28 crc kubenswrapper[4774]: 
I1003 14:45:28.313937 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/923b3b0f-6810-4504-9757-fe4761f6ed37-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wq2sd\" (UID: \"923b3b0f-6810-4504-9757-fe4761f6ed37\") " pod="openshift-marketplace/marketplace-operator-79b997595-wq2sd" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.314397 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b35e64bf-83d7-47de-b783-5b602815d90a-socket-dir\") pod \"csi-hostpathplugin-jtjpd\" (UID: \"b35e64bf-83d7-47de-b783-5b602815d90a\") " pod="hostpath-provisioner/csi-hostpathplugin-jtjpd" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.314593 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cc2761ec-8878-46ce-aea2-234dd179fbe6-images\") pod \"machine-config-operator-74547568cd-gwjdm\" (UID: \"cc2761ec-8878-46ce-aea2-234dd179fbe6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gwjdm" Oct 03 14:45:28 crc kubenswrapper[4774]: E1003 14:45:28.314786 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:28.814769442 +0000 UTC m=+151.403973044 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.315509 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b35e64bf-83d7-47de-b783-5b602815d90a-mountpoint-dir\") pod \"csi-hostpathplugin-jtjpd\" (UID: \"b35e64bf-83d7-47de-b783-5b602815d90a\") " pod="hostpath-provisioner/csi-hostpathplugin-jtjpd" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.315975 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a4949a78-2546-4a0c-b280-871726da80af-config-volume\") pod \"dns-default-8pqx5\" (UID: \"a4949a78-2546-4a0c-b280-871726da80af\") " pod="openshift-dns/dns-default-8pqx5" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.316392 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cc2761ec-8878-46ce-aea2-234dd179fbe6-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gwjdm\" (UID: \"cc2761ec-8878-46ce-aea2-234dd179fbe6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gwjdm" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.316441 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67-serving-cert\") pod \"etcd-operator-b45778765-wjcbg\" (UID: \"8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-wjcbg" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.316742 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9-config-volume\") pod \"collect-profiles-29325045-8kfbc\" (UID: \"951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-8kfbc" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.316791 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f27658be-3b8d-4706-9e91-831e832c34a7-config\") pod \"kube-controller-manager-operator-78b949d7b-s7km8\" (UID: \"f27658be-3b8d-4706-9e91-831e832c34a7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s7km8" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.317460 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67-config\") pod \"etcd-operator-b45778765-wjcbg\" (UID: \"8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wjcbg" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.317484 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67-etcd-service-ca\") pod \"etcd-operator-b45778765-wjcbg\" (UID: \"8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wjcbg" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.317469 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67-etcd-ca\") pod \"etcd-operator-b45778765-wjcbg\" (UID: 
\"8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wjcbg" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.317899 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cc2761ec-8878-46ce-aea2-234dd179fbe6-proxy-tls\") pod \"machine-config-operator-74547568cd-gwjdm\" (UID: \"cc2761ec-8878-46ce-aea2-234dd179fbe6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gwjdm" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.319050 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/67a98087-a2b3-457a-be93-8fd1203b825d-metrics-certs\") pod \"router-default-5444994796-s946h\" (UID: \"67a98087-a2b3-457a-be93-8fd1203b825d\") " pod="openshift-ingress/router-default-5444994796-s946h" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.319738 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/087458d5-9967-4dda-8084-028ea5b92cd1-cert\") pod \"ingress-canary-dk4v5\" (UID: \"087458d5-9967-4dda-8084-028ea5b92cd1\") " pod="openshift-ingress-canary/ingress-canary-dk4v5" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.319993 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/86a66a03-a1cd-4d49-84d5-392cfe63d1ae-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-jsbrr\" (UID: \"86a66a03-a1cd-4d49-84d5-392cfe63d1ae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jsbrr" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.320110 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c27f3211-f4ba-485a-9bf7-edf913003803-certs\") pod \"machine-config-server-t8q7s\" 
(UID: \"c27f3211-f4ba-485a-9bf7-edf913003803\") " pod="openshift-machine-config-operator/machine-config-server-t8q7s" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.320191 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/67a98087-a2b3-457a-be93-8fd1203b825d-stats-auth\") pod \"router-default-5444994796-s946h\" (UID: \"67a98087-a2b3-457a-be93-8fd1203b825d\") " pod="openshift-ingress/router-default-5444994796-s946h" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.320768 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c27f3211-f4ba-485a-9bf7-edf913003803-node-bootstrap-token\") pod \"machine-config-server-t8q7s\" (UID: \"c27f3211-f4ba-485a-9bf7-edf913003803\") " pod="openshift-machine-config-operator/machine-config-server-t8q7s" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.320794 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a4949a78-2546-4a0c-b280-871726da80af-metrics-tls\") pod \"dns-default-8pqx5\" (UID: \"a4949a78-2546-4a0c-b280-871726da80af\") " pod="openshift-dns/dns-default-8pqx5" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.320876 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9-secret-volume\") pod \"collect-profiles-29325045-8kfbc\" (UID: \"951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-8kfbc" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.321285 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67-etcd-client\") pod \"etcd-operator-b45778765-wjcbg\" (UID: 
\"8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wjcbg" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.321305 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88303d53-7218-4326-917e-c075e1ec1f12-serving-cert\") pod \"service-ca-operator-777779d784-kcqsn\" (UID: \"88303d53-7218-4326-917e-c075e1ec1f12\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kcqsn" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.321444 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/98cf7a02-ed29-4cd4-9f60-77659f186e4b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-7qxcv\" (UID: \"98cf7a02-ed29-4cd4-9f60-77659f186e4b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7qxcv" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.321551 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/923b3b0f-6810-4504-9757-fe4761f6ed37-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wq2sd\" (UID: \"923b3b0f-6810-4504-9757-fe4761f6ed37\") " pod="openshift-marketplace/marketplace-operator-79b997595-wq2sd" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.323930 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlqr9\" (UniqueName: \"kubernetes.io/projected/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-kube-api-access-xlqr9\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.327154 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwml7" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.338179 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8ll4\" (UniqueName: \"kubernetes.io/projected/15557932-56e7-4119-bdec-104aa40ae284-kube-api-access-c8ll4\") pod \"multus-admission-controller-857f4d67dd-gwcm9\" (UID: \"15557932-56e7-4119-bdec-104aa40ae284\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gwcm9" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.357438 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/825e4f26-ffbe-478b-a09b-7ffc436d4b4d-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-t8vg7\" (UID: \"825e4f26-ffbe-478b-a09b-7ffc436d4b4d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8vg7" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.370065 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5a4b24b0-5a45-41de-a84b-f7e9d4e773f1-tmpfs\") pod \"packageserver-d55dfcdfc-4rcc5\" (UID: \"5a4b24b0-5a45-41de-a84b-f7e9d4e773f1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4rcc5" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.370642 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5a4b24b0-5a45-41de-a84b-f7e9d4e773f1-webhook-cert\") pod \"packageserver-d55dfcdfc-4rcc5\" (UID: \"5a4b24b0-5a45-41de-a84b-f7e9d4e773f1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4rcc5" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.371027 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/5a4b24b0-5a45-41de-a84b-f7e9d4e773f1-apiservice-cert\") pod \"packageserver-d55dfcdfc-4rcc5\" (UID: \"5a4b24b0-5a45-41de-a84b-f7e9d4e773f1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4rcc5" Oct 03 14:45:28 crc kubenswrapper[4774]: W1003 14:45:28.383254 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cae35f2_fcf0_4014_9b5b_9887d416e8d3.slice/crio-b6c407d268222c6dab8c929e3b39f6c883db20758277461ca5ebab772f9fe318 WatchSource:0}: Error finding container b6c407d268222c6dab8c929e3b39f6c883db20758277461ca5ebab772f9fe318: Status 404 returned error can't find the container with id b6c407d268222c6dab8c929e3b39f6c883db20758277461ca5ebab772f9fe318 Oct 03 14:45:28 crc kubenswrapper[4774]: W1003 14:45:28.384605 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40de7d67_3283_492a_bfc8_65c83c19421f.slice/crio-2e114a7df4f8ca6dcc12c02a3346243b3282714593618a14003aa3cc64e007a7 WatchSource:0}: Error finding container 2e114a7df4f8ca6dcc12c02a3346243b3282714593618a14003aa3cc64e007a7: Status 404 returned error can't find the container with id 2e114a7df4f8ca6dcc12c02a3346243b3282714593618a14003aa3cc64e007a7 Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.400968 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t56nl\" (UniqueName: \"kubernetes.io/projected/5a4b24b0-5a45-41de-a84b-f7e9d4e773f1-kube-api-access-t56nl\") pod \"packageserver-d55dfcdfc-4rcc5\" (UID: \"5a4b24b0-5a45-41de-a84b-f7e9d4e773f1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4rcc5" Oct 03 14:45:28 crc kubenswrapper[4774]: W1003 14:45:28.401047 4774 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a48eac5_1dc7_4478_ac71_084ad6302324.slice/crio-b73e912f7d99d38ae3aa6e5d12bb260c9faa81e7448f842f0328f17ce4350425 WatchSource:0}: Error finding container b73e912f7d99d38ae3aa6e5d12bb260c9faa81e7448f842f0328f17ce4350425: Status 404 returned error can't find the container with id b73e912f7d99d38ae3aa6e5d12bb260c9faa81e7448f842f0328f17ce4350425 Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.409664 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:28 crc kubenswrapper[4774]: E1003 14:45:28.409856 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:28.909814415 +0000 UTC m=+151.499017867 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.410231 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:28 crc kubenswrapper[4774]: E1003 14:45:28.410672 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:28.910649531 +0000 UTC m=+151.499853153 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.424849 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4rcc5" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.468403 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qctl\" (UniqueName: \"kubernetes.io/projected/67a98087-a2b3-457a-be93-8fd1203b825d-kube-api-access-9qctl\") pod \"router-default-5444994796-s946h\" (UID: \"67a98087-a2b3-457a-be93-8fd1203b825d\") " pod="openshift-ingress/router-default-5444994796-s946h" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.484148 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbhht\" (UniqueName: \"kubernetes.io/projected/a4949a78-2546-4a0c-b280-871726da80af-kube-api-access-kbhht\") pod \"dns-default-8pqx5\" (UID: \"a4949a78-2546-4a0c-b280-871726da80af\") " pod="openshift-dns/dns-default-8pqx5" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.492157 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.501469 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cktv\" (UniqueName: \"kubernetes.io/projected/8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67-kube-api-access-8cktv\") pod \"etcd-operator-b45778765-wjcbg\" (UID: \"8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wjcbg" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.510944 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:28 crc kubenswrapper[4774]: E1003 14:45:28.511283 4774 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:29.011268878 +0000 UTC m=+151.600472330 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.520695 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pwqw\" (UniqueName: \"kubernetes.io/projected/087458d5-9967-4dda-8084-028ea5b92cd1-kube-api-access-6pwqw\") pod \"ingress-canary-dk4v5\" (UID: \"087458d5-9967-4dda-8084-028ea5b92cd1\") " pod="openshift-ingress-canary/ingress-canary-dk4v5" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.533225 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8vg7" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.538432 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf8zs\" (UniqueName: \"kubernetes.io/projected/c27f3211-f4ba-485a-9bf7-edf913003803-kube-api-access-pf8zs\") pod \"machine-config-server-t8q7s\" (UID: \"c27f3211-f4ba-485a-9bf7-edf913003803\") " pod="openshift-machine-config-operator/machine-config-server-t8q7s" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.546806 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-gwcm9" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.559991 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f27658be-3b8d-4706-9e91-831e832c34a7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-s7km8\" (UID: \"f27658be-3b8d-4706-9e91-831e832c34a7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s7km8" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.566175 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jmqgl"] Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.580695 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7rtc\" (UniqueName: \"kubernetes.io/projected/923b3b0f-6810-4504-9757-fe4761f6ed37-kube-api-access-d7rtc\") pod \"marketplace-operator-79b997595-wq2sd\" (UID: \"923b3b0f-6810-4504-9757-fe4761f6ed37\") " pod="openshift-marketplace/marketplace-operator-79b997595-wq2sd" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.588435 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-vkxrc"] Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.602626 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd4x9\" (UniqueName: \"kubernetes.io/projected/86a66a03-a1cd-4d49-84d5-392cfe63d1ae-kube-api-access-fd4x9\") pod \"package-server-manager-789f6589d5-jsbrr\" (UID: \"86a66a03-a1cd-4d49-84d5-392cfe63d1ae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jsbrr" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.607060 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7z59k"] Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.612290 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:28 crc kubenswrapper[4774]: E1003 14:45:28.612884 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:29.112862655 +0000 UTC m=+151.702066117 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.650576 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jsbrr" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.652954 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhpsm\" (UniqueName: \"kubernetes.io/projected/951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9-kube-api-access-jhpsm\") pod \"collect-profiles-29325045-8kfbc\" (UID: \"951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-8kfbc" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.654142 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbt76\" (UniqueName: \"kubernetes.io/projected/98cf7a02-ed29-4cd4-9f60-77659f186e4b-kube-api-access-fbt76\") pod \"control-plane-machine-set-operator-78cbb6b69f-7qxcv\" (UID: \"98cf7a02-ed29-4cd4-9f60-77659f186e4b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7qxcv" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.656963 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g49s\" (UniqueName: \"kubernetes.io/projected/e0b599ea-9f7f-483a-a10b-4a5eb0fedce6-kube-api-access-4g49s\") pod \"service-ca-9c57cc56f-dz4j8\" (UID: \"e0b599ea-9f7f-483a-a10b-4a5eb0fedce6\") " pod="openshift-service-ca/service-ca-9c57cc56f-dz4j8" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.657766 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cddbm\" (UniqueName: \"kubernetes.io/projected/88303d53-7218-4326-917e-c075e1ec1f12-kube-api-access-cddbm\") pod \"service-ca-operator-777779d784-kcqsn\" (UID: \"88303d53-7218-4326-917e-c075e1ec1f12\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kcqsn" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.657935 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-s946h" Oct 03 14:45:28 crc kubenswrapper[4774]: W1003 14:45:28.672560 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07e852a6_9dbf_4533_85ad_d64d008bf488.slice/crio-53e51092bf7286ba31dd859924364447e9de922f24ed8b4a85db02fa9cb8a884 WatchSource:0}: Error finding container 53e51092bf7286ba31dd859924364447e9de922f24ed8b4a85db02fa9cb8a884: Status 404 returned error can't find the container with id 53e51092bf7286ba31dd859924364447e9de922f24ed8b4a85db02fa9cb8a884 Oct 03 14:45:28 crc kubenswrapper[4774]: W1003 14:45:28.673234 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c0571f5_aaf0_4189_a20c_66f5b173ea49.slice/crio-81b4d2bdd46a6ad3f8fff16540e9b57e6a0d08dd9e9e64bff920d79fb89dc5d9 WatchSource:0}: Error finding container 81b4d2bdd46a6ad3f8fff16540e9b57e6a0d08dd9e9e64bff920d79fb89dc5d9: Status 404 returned error can't find the container with id 81b4d2bdd46a6ad3f8fff16540e9b57e6a0d08dd9e9e64bff920d79fb89dc5d9 Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.675059 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpcrp\" (UniqueName: \"kubernetes.io/projected/b35e64bf-83d7-47de-b783-5b602815d90a-kube-api-access-wpcrp\") pod \"csi-hostpathplugin-jtjpd\" (UID: \"b35e64bf-83d7-47de-b783-5b602815d90a\") " pod="hostpath-provisioner/csi-hostpathplugin-jtjpd" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.675059 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s7km8" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.677017 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb6qz\" (UniqueName: \"kubernetes.io/projected/cc2761ec-8878-46ce-aea2-234dd179fbe6-kube-api-access-gb6qz\") pod \"machine-config-operator-74547568cd-gwjdm\" (UID: \"cc2761ec-8878-46ce-aea2-234dd179fbe6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gwjdm" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.681642 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-wjcbg" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.690298 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-kcqsn" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.698647 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wq2sd" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.706945 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-dz4j8" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.713889 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.714116 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-8kfbc" Oct 03 14:45:28 crc kubenswrapper[4774]: E1003 14:45:28.714345 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:29.214323488 +0000 UTC m=+151.803526950 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.735591 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dk4v5" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.755037 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8pqx5" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.759558 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-t8q7s" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.782694 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-jtjpd" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.815601 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:28 crc kubenswrapper[4774]: E1003 14:45:28.815968 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:29.315956827 +0000 UTC m=+151.905160279 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.916768 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:28 crc kubenswrapper[4774]: E1003 14:45:28.917070 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:29.417033428 +0000 UTC m=+152.006236910 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.917629 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:28 crc kubenswrapper[4774]: E1003 14:45:28.917906 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:29.417894415 +0000 UTC m=+152.007097867 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.942900 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7qxcv" Oct 03 14:45:28 crc kubenswrapper[4774]: I1003 14:45:28.967888 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gwjdm" Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.022820 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:29 crc kubenswrapper[4774]: E1003 14:45:29.023172 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:29.523137846 +0000 UTC m=+152.112341318 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.023450 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:29 crc kubenswrapper[4774]: E1003 14:45:29.023920 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:29.52390453 +0000 UTC m=+152.113107982 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.030834 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7z59k" event={"ID":"6c0571f5-aaf0-4189-a20c-66f5b173ea49","Type":"ContainerStarted","Data":"81b4d2bdd46a6ad3f8fff16540e9b57e6a0d08dd9e9e64bff920d79fb89dc5d9"} Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.033856 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j7gl6" event={"ID":"5a48eac5-1dc7-4478-ac71-084ad6302324","Type":"ContainerStarted","Data":"b73e912f7d99d38ae3aa6e5d12bb260c9faa81e7448f842f0328f17ce4350425"} Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.034905 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pdnsv" event={"ID":"40de7d67-3283-492a-bfc8-65c83c19421f","Type":"ContainerStarted","Data":"2e114a7df4f8ca6dcc12c02a3346243b3282714593618a14003aa3cc64e007a7"} Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.036026 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vpxwc" event={"ID":"936c81dd-638d-4776-86fe-54a8aa53e50e","Type":"ContainerStarted","Data":"a66f799d2dc18b24c2fdc49069ffea0ad7a3b04af0842ba43d4a8f2c85e9aa14"} Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.036055 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pdhbv"] Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.037913 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-5nrkx" event={"ID":"c0c1f48e-db09-49cd-82db-e687ea384d2c","Type":"ContainerStarted","Data":"e7692f8187ad5f2af0a03672d7c5bf51a42a0a746c55b8e4bb6922c55bca96b0"} Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.039064 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8s9fl" event={"ID":"0d562068-166d-42b5-9f5f-3e33460f5410","Type":"ContainerStarted","Data":"b24a5afe2cef38f07c879b86f29875880284d43d96e49e7522068324d00cc05b"} Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.042035 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vnvw7" event={"ID":"9cae35f2-fcf0-4014-9b5b-9887d416e8d3","Type":"ContainerStarted","Data":"b6c407d268222c6dab8c929e3b39f6c883db20758277461ca5ebab772f9fe318"} Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.042980 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jmqgl" event={"ID":"72b9524f-58b1-451e-91e9-f244f763165d","Type":"ContainerStarted","Data":"f25743cfe5e3cafbf8c5644d1c55a17ca0a464ec0d69f1d3bd1ed1e65fea647b"} Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.044024 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pnvpf" event={"ID":"3b5a60df-c3e9-4553-92d2-0c8d8752d007","Type":"ContainerStarted","Data":"81e51fffa679690f98c5aec9832eb0569dfe056ed726915c968f273c9a0af75c"} Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.045769 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrmzx" event={"ID":"004c9445-7b3b-4479-bf6e-e6d880e4c7bb","Type":"ContainerStarted","Data":"669804207e1ec76352a5affca2445e1821060af89b1e9c1b67bf41f81ac12332"} Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.047555 4774 generic.go:334] "Generic (PLEG): container finished" podID="352c3d22-5aaa-47c1-a712-9022736511f4" containerID="3b0bee3d527acc8382d3021646443170180fcb7e8bcc8dbd63d2cebb83bd8ad2" exitCode=0 Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.047701 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn" event={"ID":"352c3d22-5aaa-47c1-a712-9022736511f4","Type":"ContainerDied","Data":"3b0bee3d527acc8382d3021646443170180fcb7e8bcc8dbd63d2cebb83bd8ad2"} Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.052511 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vkxrc" event={"ID":"07e852a6-9dbf-4533-85ad-d64d008bf488","Type":"ContainerStarted","Data":"53e51092bf7286ba31dd859924364447e9de922f24ed8b4a85db02fa9cb8a884"} Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.053119 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-jh8hv" Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.054149 4774 patch_prober.go:28] interesting pod/downloads-7954f5f757-jh8hv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body= Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.054194 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jh8hv" podUID="40ce0e8a-b3ee-4b5a-a3ef-ee0e4a573496" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.34:8080/\": dial tcp 
10.217.0.34:8080: connect: connection refused" Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.055576 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4rcc5"] Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.124860 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:29 crc kubenswrapper[4774]: E1003 14:45:29.125197 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:29.625171997 +0000 UTC m=+152.214375449 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.217431 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8vg7"] Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.227255 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:29 crc kubenswrapper[4774]: E1003 14:45:29.227648 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:29.727635521 +0000 UTC m=+152.316838973 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.328414 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:29 crc kubenswrapper[4774]: E1003 14:45:29.329019 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:29.829001582 +0000 UTC m=+152.418205034 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.369399 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-jh8hv" podStartSLOduration=124.369355414 podStartE2EDuration="2m4.369355414s" podCreationTimestamp="2025-10-03 14:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:29.35931568 +0000 UTC m=+151.948519152" watchObservedRunningTime="2025-10-03 14:45:29.369355414 +0000 UTC m=+151.958558866" Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.377745 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zh5qn"] Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.384706 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwml7"] Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.430976 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:29 crc kubenswrapper[4774]: E1003 14:45:29.431300 4774 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:29.931286571 +0000 UTC m=+152.520490023 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.446512 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g6pd2"] Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.531998 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:29 crc kubenswrapper[4774]: E1003 14:45:29.532474 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:30.032453925 +0000 UTC m=+152.621657377 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.541305 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-hw8fn"] Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.637481 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:29 crc kubenswrapper[4774]: E1003 14:45:29.637851 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:30.13783798 +0000 UTC m=+152.727041422 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.743866 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:29 crc kubenswrapper[4774]: E1003 14:45:29.744623 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:30.244595459 +0000 UTC m=+152.833798911 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.744749 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:29 crc kubenswrapper[4774]: E1003 14:45:29.745914 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:30.24589311 +0000 UTC m=+152.835096612 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.845645 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:29 crc kubenswrapper[4774]: E1003 14:45:29.845748 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:30.345733782 +0000 UTC m=+152.934937234 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.845863 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:29 crc kubenswrapper[4774]: E1003 14:45:29.846216 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:30.346209877 +0000 UTC m=+152.935413329 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.886571 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jsbrr"] Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.894364 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-p27cs"] Oct 03 14:45:29 crc kubenswrapper[4774]: I1003 14:45:29.947432 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:29 crc kubenswrapper[4774]: E1003 14:45:29.947784 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:30.447759463 +0000 UTC m=+153.036962915 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:30 crc kubenswrapper[4774]: W1003 14:45:30.025900 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86a66a03_a1cd_4d49_84d5_392cfe63d1ae.slice/crio-d15289edf50718f88d1d60dd223b6a409243a8f416a0ca74ebedc6f6d298cbd2 WatchSource:0}: Error finding container d15289edf50718f88d1d60dd223b6a409243a8f416a0ca74ebedc6f6d298cbd2: Status 404 returned error can't find the container with id d15289edf50718f88d1d60dd223b6a409243a8f416a0ca74ebedc6f6d298cbd2 Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.048801 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:30 crc kubenswrapper[4774]: E1003 14:45:30.049237 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:30.549222847 +0000 UTC m=+153.138426299 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.060324 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s7km8"] Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.102999 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crfxb" event={"ID":"a5b48e24-98c3-4875-9e6f-9d2de0463048","Type":"ContainerStarted","Data":"8a7b5e9a1bb295460ef6689ce3a8508fe4c4b0d17e4e94004b67419d2234bff1"} Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.104523 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwml7" event={"ID":"48199fa0-dc2a-4df0-b3d9-dca85040f205","Type":"ContainerStarted","Data":"7068745befa1c5072f68297db088f4d0d1fccaa7cf101e317e2b6dbecca5c8d8"} Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.105956 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" event={"ID":"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b","Type":"ContainerStarted","Data":"f7a9328a8ba2525175dcab54bbd33023d074dbf8754e228f73ef1b5cf08d6f98"} Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.110438 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-7qr8w" 
event={"ID":"bd925ec4-97eb-4780-82a0-e7dc782542ce","Type":"ContainerStarted","Data":"aa8a90971ed18c83d7bfe61dd2d3f9feecdcdf26288318b8f278e76549172a46"} Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.112421 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-7qr8w" Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.113580 4774 patch_prober.go:28] interesting pod/console-operator-58897d9998-7qr8w container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.113622 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-7qr8w" podUID="bd925ec4-97eb-4780-82a0-e7dc782542ce" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.120990 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vnvw7" event={"ID":"9cae35f2-fcf0-4014-9b5b-9887d416e8d3","Type":"ContainerStarted","Data":"3f4a8a0cb217cd6b8c10bbac0087200b71b5c5ae918194f1419ee50edf7d34c2"} Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.129476 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-gwcm9"] Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.132334 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-kcqsn"] Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.139002 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325045-8kfbc"] Oct 03 
14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.141717 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pnvpf" event={"ID":"3b5a60df-c3e9-4553-92d2-0c8d8752d007","Type":"ContainerStarted","Data":"1f1a9d1fc67686f9f6c6c217d3dcd2bf6906ab2904c10be7962576ea5cd599fa"} Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.149450 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.149660 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.149709 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.149767 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.149847 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:45:30 crc kubenswrapper[4774]: E1003 14:45:30.150456 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:30.650427292 +0000 UTC m=+153.239630744 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.151129 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-wjcbg"] Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.152032 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pdnsv" event={"ID":"40de7d67-3283-492a-bfc8-65c83c19421f","Type":"ContainerStarted","Data":"12fc2c7ed84cb699026349a6b69ab8277cc8d528b7b7829d95f16c975e1a8226"} Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.153527 4774 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.157383 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-7qr8w" podStartSLOduration=125.157359779 podStartE2EDuration="2m5.157359779s" podCreationTimestamp="2025-10-03 14:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:30.156160461 +0000 UTC m=+152.745363913" watchObservedRunningTime="2025-10-03 14:45:30.157359779 +0000 UTC m=+152.746563231" Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.181924 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-t8q7s" event={"ID":"c27f3211-f4ba-485a-9bf7-edf913003803","Type":"ContainerStarted","Data":"0f9a6e6750f0997d2bc81da8ba265b0e77137f7322ef3ed645f8fe13ec523d26"} Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.187143 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.187473 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod 
\"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.189065 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.204798 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4rcc5" event={"ID":"5a4b24b0-5a45-41de-a84b-f7e9d4e773f1","Type":"ContainerStarted","Data":"f6d695b012c22664884c9ed76bbdd1e785db3ff81bbba6c186161f0d8821e24a"} Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.209344 4774 generic.go:334] "Generic (PLEG): container finished" podID="98199f15-3582-4b64-a118-c97f9ddadd11" containerID="f0c3586a3f73d48bc65d8a0ab5098b9bf4354f6dae26a4613b3885ef0f1af02d" exitCode=0 Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.209593 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" event={"ID":"98199f15-3582-4b64-a118-c97f9ddadd11","Type":"ContainerDied","Data":"f0c3586a3f73d48bc65d8a0ab5098b9bf4354f6dae26a4613b3885ef0f1af02d"} Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.223617 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.223882 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vkxrc" event={"ID":"07e852a6-9dbf-4533-85ad-d64d008bf488","Type":"ContainerStarted","Data":"cd99bf2234004650ba3dcbb667d0b2d91e4fbab51e0d4b484a3ee6129f763e79"} Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.233291 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.241793 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.251747 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-hw8fn" event={"ID":"2ffc56e3-143d-4b1d-8ba3-1619951778e7","Type":"ContainerStarted","Data":"764456d881b967cc6f0cb91d8e21f2ccf308c34e59c0ca4f5ee6f0071d101102"} Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.251994 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:30 crc kubenswrapper[4774]: E1003 14:45:30.252486 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:30.752466763 +0000 UTC m=+153.341670255 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.280885 4774 generic.go:334] "Generic (PLEG): container finished" podID="0d562068-166d-42b5-9f5f-3e33460f5410" containerID="b24a5afe2cef38f07c879b86f29875880284d43d96e49e7522068324d00cc05b" exitCode=0 Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.280971 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8s9fl" event={"ID":"0d562068-166d-42b5-9f5f-3e33460f5410","Type":"ContainerDied","Data":"b24a5afe2cef38f07c879b86f29875880284d43d96e49e7522068324d00cc05b"} Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.287043 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pnvpf" podStartSLOduration=125.287015764 podStartE2EDuration="2m5.287015764s" podCreationTimestamp="2025-10-03 14:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:30.24662595 +0000 UTC m=+152.835829402" watchObservedRunningTime="2025-10-03 14:45:30.287015764 +0000 UTC m=+152.876219226" Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.293574 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zh5qn" 
event={"ID":"83f48fb3-47e0-4266-9424-b54b47551fce","Type":"ContainerStarted","Data":"28fbe3a2e42daf3f6acdabac4f661bcfdd5fe8970dac2a092d4046a252befc6e"} Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.297502 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jmqgl" event={"ID":"72b9524f-58b1-451e-91e9-f244f763165d","Type":"ContainerStarted","Data":"838ebcad11400cfb5592e059a40bfc68325323a989a01dac071aebfee0c1c930"} Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.345505 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-s946h" event={"ID":"67a98087-a2b3-457a-be93-8fd1203b825d","Type":"ContainerStarted","Data":"762c59494765e203c9830391fb0d6ac9ff76959fb23b19702fdb7b67fe1d3aef"} Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.345557 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-s946h" event={"ID":"67a98087-a2b3-457a-be93-8fd1203b825d","Type":"ContainerStarted","Data":"9b766505d7583c11277389a8c777dbfcd37e7569aec62f50fd1478b569a2aebe"} Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.348060 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8pqx5"] Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.358171 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dk4v5"] Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.359612 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-dz4j8"] Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.359643 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-jtjpd"] Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.359959 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:30 crc kubenswrapper[4774]: E1003 14:45:30.360078 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:30.860059548 +0000 UTC m=+153.449263000 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.360697 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:30 crc kubenswrapper[4774]: E1003 14:45:30.362726 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:30.862712611 +0000 UTC m=+153.451916123 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.389872 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wq2sd"] Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.392406 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7nrwv" event={"ID":"b5e417ed-5b5e-405b-8b95-ed27ddaef9ee","Type":"ContainerStarted","Data":"35d47118f363198b11bcb61bf3a46606ff921aae9cf7a285f6e18ea0999c7793"} Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.399744 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g6pd2" event={"ID":"63d4244b-7a5e-4b4d-8fc0-52440b50b276","Type":"ContainerStarted","Data":"419555d5dce82a3842038c51e7b0bd198df187d49ba8fbae59291ad108ddb015"} Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.403280 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8vg7" event={"ID":"825e4f26-ffbe-478b-a09b-7ffc436d4b4d","Type":"ContainerStarted","Data":"000eecf61a83214c89bfe42fd6291e6529faf7c9bd0b325c0ef6c6823b306dfd"} Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.405235 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pdhbv" 
event={"ID":"9621ba12-6602-4afb-80bb-116e84daef13","Type":"ContainerStarted","Data":"d62008029577c0cde8a297cf305c796164e03215c96aa18372c161d28dd314b2"} Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.406966 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7qxcv"] Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.407937 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-vnvw7" podStartSLOduration=125.407921875 podStartE2EDuration="2m5.407921875s" podCreationTimestamp="2025-10-03 14:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:30.390428648 +0000 UTC m=+152.979632100" watchObservedRunningTime="2025-10-03 14:45:30.407921875 +0000 UTC m=+152.997125327" Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.419192 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l9c9f" event={"ID":"e9be6b1c-e3c3-470e-9387-e2a6abd005aa","Type":"ContainerStarted","Data":"e470a71e05fe76e81ec2ae5a3a0e2999882df9a4fc9911ccde7e75ed336b2c67"} Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.419247 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l9c9f" event={"ID":"e9be6b1c-e3c3-470e-9387-e2a6abd005aa","Type":"ContainerStarted","Data":"5abba32ac5c8c730541c5a7f610ed344683add6d918ba24daee3774b80c77f23"} Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.435010 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7z59k" event={"ID":"6c0571f5-aaf0-4189-a20c-66f5b173ea49","Type":"ContainerStarted","Data":"ea14c1de05b42fe99a21dc8e2dbffc1ad100c2109b3975090044b10c1c26f3e7"} Oct 03 14:45:30 crc 
kubenswrapper[4774]: I1003 14:45:30.454722 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j7gl6" event={"ID":"5a48eac5-1dc7-4478-ac71-084ad6302324","Type":"ContainerStarted","Data":"1f51a9e9645cecaa1dded203e547079336f5c32904fa593f29debef63598cb9c"} Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.454780 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-vpxwc" Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.454798 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrmzx" Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.461451 4774 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-vpxwc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.461497 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-vpxwc" podUID="936c81dd-638d-4776-86fe-54a8aa53e50e" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.461981 4774 patch_prober.go:28] interesting pod/downloads-7954f5f757-jh8hv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body= Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.462061 4774 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console/downloads-7954f5f757-jh8hv" podUID="40ce0e8a-b3ee-4b5a-a3ef-ee0e4a573496" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.34:8080/\": dial tcp 10.217.0.34:8080: connect: connection refused" Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.462452 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:30 crc kubenswrapper[4774]: E1003 14:45:30.462678 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:30.962658987 +0000 UTC m=+153.551862439 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.463749 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:30 crc kubenswrapper[4774]: E1003 14:45:30.464366 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:30.96435293 +0000 UTC m=+153.553556432 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:30 crc kubenswrapper[4774]: W1003 14:45:30.471574 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4949a78_2546_4a0c_b280_871726da80af.slice/crio-af32afe9734ddb2ceda2c1df061b07237ddadbf1a26749724438e27784ef9cd1 WatchSource:0}: Error finding container af32afe9734ddb2ceda2c1df061b07237ddadbf1a26749724438e27784ef9cd1: Status 404 returned error can't find the container with id af32afe9734ddb2ceda2c1df061b07237ddadbf1a26749724438e27784ef9cd1 Oct 03 14:45:30 crc kubenswrapper[4774]: W1003 14:45:30.486709 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0b599ea_9f7f_483a_a10b_4a5eb0fedce6.slice/crio-7a825db2e6ae387e937efddc290e0371df1471df87bf2b5a96b1b74ee669629e WatchSource:0}: Error finding container 7a825db2e6ae387e937efddc290e0371df1471df87bf2b5a96b1b74ee669629e: Status 404 returned error can't find the container with id 7a825db2e6ae387e937efddc290e0371df1471df87bf2b5a96b1b74ee669629e Oct 03 14:45:30 crc kubenswrapper[4774]: W1003 14:45:30.497946 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod923b3b0f_6810_4504_9757_fe4761f6ed37.slice/crio-4f4e8f93e0f851ea4d6a2cac331d092993d988b18c5dd9d61d5ed1d32d8c1a14 WatchSource:0}: Error finding container 4f4e8f93e0f851ea4d6a2cac331d092993d988b18c5dd9d61d5ed1d32d8c1a14: Status 404 returned error can't find the container 
with id 4f4e8f93e0f851ea4d6a2cac331d092993d988b18c5dd9d61d5ed1d32d8c1a14 Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.551040 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gwjdm"] Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.564640 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:30 crc kubenswrapper[4774]: E1003 14:45:30.570562 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:31.070537931 +0000 UTC m=+153.659741383 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.659998 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-s946h" Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.671313 4774 patch_prober.go:28] interesting pod/router-default-5444994796-s946h container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.672455 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:30 crc kubenswrapper[4774]: E1003 14:45:30.672979 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:31.172968654 +0000 UTC m=+153.762172106 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.674357 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s946h" podUID="67a98087-a2b3-457a-be93-8fd1203b825d" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.772921 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:30 crc kubenswrapper[4774]: E1003 14:45:30.773206 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:31.273187709 +0000 UTC m=+153.862391161 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.847835 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrmzx" Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.862699 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j7gl6" podStartSLOduration=124.862677138 podStartE2EDuration="2m4.862677138s" podCreationTimestamp="2025-10-03 14:43:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:30.819353543 +0000 UTC m=+153.408556995" watchObservedRunningTime="2025-10-03 14:45:30.862677138 +0000 UTC m=+153.451880600" Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.881571 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:30 crc kubenswrapper[4774]: E1003 14:45:30.881890 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" 
failed. No retries permitted until 2025-10-03 14:45:31.381877118 +0000 UTC m=+153.971080570 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.897671 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-s946h" podStartSLOduration=125.897653351 podStartE2EDuration="2m5.897653351s" podCreationTimestamp="2025-10-03 14:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:30.864791474 +0000 UTC m=+153.453994926" watchObservedRunningTime="2025-10-03 14:45:30.897653351 +0000 UTC m=+153.486856803" Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.936224 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-5nrkx" podStartSLOduration=125.936204877 podStartE2EDuration="2m5.936204877s" podCreationTimestamp="2025-10-03 14:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:30.900319985 +0000 UTC m=+153.489523427" watchObservedRunningTime="2025-10-03 14:45:30.936204877 +0000 UTC m=+153.525408329" Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.939217 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jmqgl" podStartSLOduration=125.939203951 podStartE2EDuration="2m5.939203951s" podCreationTimestamp="2025-10-03 14:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:30.935943879 +0000 UTC m=+153.525147331" watchObservedRunningTime="2025-10-03 14:45:30.939203951 +0000 UTC m=+153.528407413" Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.984558 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:30 crc kubenswrapper[4774]: E1003 14:45:30.984851 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:31.484821318 +0000 UTC m=+154.074024770 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:30 crc kubenswrapper[4774]: I1003 14:45:30.985153 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:30 crc kubenswrapper[4774]: E1003 14:45:30.985443 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:31.485430977 +0000 UTC m=+154.074634429 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.089445 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:31 crc kubenswrapper[4774]: E1003 14:45:31.089623 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:31.589597285 +0000 UTC m=+154.178800737 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.089780 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:31 crc kubenswrapper[4774]: E1003 14:45:31.090092 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:31.59008576 +0000 UTC m=+154.179289212 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.117340 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-7nrwv" podStartSLOduration=125.117319452 podStartE2EDuration="2m5.117319452s" podCreationTimestamp="2025-10-03 14:43:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:31.108297539 +0000 UTC m=+153.697500991" watchObservedRunningTime="2025-10-03 14:45:31.117319452 +0000 UTC m=+153.706522904" Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.151939 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-vpxwc" podStartSLOduration=126.151922544 podStartE2EDuration="2m6.151922544s" podCreationTimestamp="2025-10-03 14:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:31.144865313 +0000 UTC m=+153.734068765" watchObservedRunningTime="2025-10-03 14:45:31.151922544 +0000 UTC m=+153.741125996" Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.192753 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:31 crc kubenswrapper[4774]: E1003 14:45:31.193143 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:31.693119442 +0000 UTC m=+154.282322904 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.194211 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:31 crc kubenswrapper[4774]: E1003 14:45:31.194875 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:31.694858657 +0000 UTC m=+154.284062129 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.222091 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrmzx" podStartSLOduration=125.222053757 podStartE2EDuration="2m5.222053757s" podCreationTimestamp="2025-10-03 14:43:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:31.191890554 +0000 UTC m=+153.781094006" watchObservedRunningTime="2025-10-03 14:45:31.222053757 +0000 UTC m=+153.811257209" Oct 03 14:45:31 crc kubenswrapper[4774]: W1003 14:45:31.229163 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-80d32e13701c6b1edb4b757dcba2058c418265f8f10c78386ebc6c861a1981b8 WatchSource:0}: Error finding container 80d32e13701c6b1edb4b757dcba2058c418265f8f10c78386ebc6c861a1981b8: Status 404 returned error can't find the container with id 80d32e13701c6b1edb4b757dcba2058c418265f8f10c78386ebc6c861a1981b8 Oct 03 14:45:31 crc kubenswrapper[4774]: W1003 14:45:31.284318 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-0c9c60fc4aee6f2e8bb36338a3aeabe96b905d0c98b10ba1c9f44dc07a9e1a12 WatchSource:0}: Error finding container 
0c9c60fc4aee6f2e8bb36338a3aeabe96b905d0c98b10ba1c9f44dc07a9e1a12: Status 404 returned error can't find the container with id 0c9c60fc4aee6f2e8bb36338a3aeabe96b905d0c98b10ba1c9f44dc07a9e1a12 Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.295358 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:31 crc kubenswrapper[4774]: E1003 14:45:31.295773 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:31.795752682 +0000 UTC m=+154.384956144 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.397523 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:31 crc kubenswrapper[4774]: E1003 14:45:31.398341 4774 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:31.89832174 +0000 UTC m=+154.487525192 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.498895 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:31 crc kubenswrapper[4774]: E1003 14:45:31.499411 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:31.999395981 +0000 UTC m=+154.588599433 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.529058 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g6pd2" event={"ID":"63d4244b-7a5e-4b4d-8fc0-52440b50b276","Type":"ContainerStarted","Data":"54c6a04fd764e1e227676b333ce685038c943184dc0f2219f90a4b8048fd2686"} Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.529785 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g6pd2" Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.532855 4774 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-g6pd2 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.532927 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g6pd2" podUID="63d4244b-7a5e-4b4d-8fc0-52440b50b276" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused" Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.545688 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" 
event={"ID":"98199f15-3582-4b64-a118-c97f9ddadd11","Type":"ContainerStarted","Data":"0eda49f2fdae6b5ff4d50c5dee0186122d785e3a9dd5ee0c253bdd1f7f82fca6"} Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.548441 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crfxb" event={"ID":"a5b48e24-98c3-4875-9e6f-9d2de0463048","Type":"ContainerStarted","Data":"e95e884c6d43f74c990ef8771402fa37b296917b0c7fc2d34190397bf5f834fc"} Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.551437 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pdhbv" event={"ID":"9621ba12-6602-4afb-80bb-116e84daef13","Type":"ContainerStarted","Data":"3f1e7f2fa0ef06df7a3388cc6814bd6a37f15ad3670ec683fe1d7ff2805d0e3c"} Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.563058 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-8kfbc" event={"ID":"951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9","Type":"ContainerStarted","Data":"1a02b436e27ea94f0d0c69bf07983b638385eb7dc87c0284146192a5e0740503"} Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.563110 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-8kfbc" event={"ID":"951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9","Type":"ContainerStarted","Data":"faa04790f5b4b84fe7a6c0618a0ea7f35d3e6ea0cbc5c77d7d346d3f3436b3ed"} Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.565276 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7z59k" event={"ID":"6c0571f5-aaf0-4189-a20c-66f5b173ea49","Type":"ContainerStarted","Data":"f39e7f6bf42a3ec34cb1e3ae81c651f49ab4368bd1451120658fa2eec5277424"} Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.584188 4774 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g6pd2" podStartSLOduration=125.584167172 podStartE2EDuration="2m5.584167172s" podCreationTimestamp="2025-10-03 14:43:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:31.575101819 +0000 UTC m=+154.164305271" watchObservedRunningTime="2025-10-03 14:45:31.584167172 +0000 UTC m=+154.173370624" Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.585086 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"80d32e13701c6b1edb4b757dcba2058c418265f8f10c78386ebc6c861a1981b8"} Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.592628 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gwcm9" event={"ID":"15557932-56e7-4119-bdec-104aa40ae284","Type":"ContainerStarted","Data":"d49fea9b849d2043c29a4aac8e04018cc0827ae11bc067cffc763df54d877bdf"} Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.592957 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gwcm9" event={"ID":"15557932-56e7-4119-bdec-104aa40ae284","Type":"ContainerStarted","Data":"bf0701cba746ffadf41131de6e37ba57025481f3ad86080a333c0cdfd1ff4d39"} Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.600313 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:31 crc kubenswrapper[4774]: 
I1003 14:45:31.601755 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wq2sd" event={"ID":"923b3b0f-6810-4504-9757-fe4761f6ed37","Type":"ContainerStarted","Data":"4f4e8f93e0f851ea4d6a2cac331d092993d988b18c5dd9d61d5ed1d32d8c1a14"} Oct 03 14:45:31 crc kubenswrapper[4774]: E1003 14:45:31.601796 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:32.101780913 +0000 UTC m=+154.690984355 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.611114 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-crfxb" podStartSLOduration=126.611094765 podStartE2EDuration="2m6.611094765s" podCreationTimestamp="2025-10-03 14:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:31.610703692 +0000 UTC m=+154.199907144" watchObservedRunningTime="2025-10-03 14:45:31.611094765 +0000 UTC m=+154.200298217" Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.613208 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vkxrc" 
event={"ID":"07e852a6-9dbf-4533-85ad-d64d008bf488","Type":"ContainerStarted","Data":"bfb27a4a0b0d17356ca3e2b5bf166cf3e0854360616f69647fb56cd26d00e89a"} Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.620672 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7qxcv" event={"ID":"98cf7a02-ed29-4cd4-9f60-77659f186e4b","Type":"ContainerStarted","Data":"83ed6fe4317017937970676c99866f40240c5eb374e524572ec0ce779efb468a"} Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.649311 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" event={"ID":"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b","Type":"ContainerStarted","Data":"ea62533f404c55e689258825792b52a0d686c204a38958091b86cf6b6a3d50f9"} Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.650752 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.654876 4774 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-p27cs container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.20:6443/healthz\": dial tcp 10.217.0.20:6443: connect: connection refused" start-of-body= Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.654925 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" podUID="aeba6428-0be6-4f7e-96f8-5b23fb08dd3b" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.20:6443/healthz\": dial tcp 10.217.0.20:6443: connect: connection refused" Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.663681 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pdhbv" 
podStartSLOduration=125.663660138 podStartE2EDuration="2m5.663660138s" podCreationTimestamp="2025-10-03 14:43:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:31.655188144 +0000 UTC m=+154.244391606" watchObservedRunningTime="2025-10-03 14:45:31.663660138 +0000 UTC m=+154.252863590" Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.672044 4774 patch_prober.go:28] interesting pod/router-default-5444994796-s946h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 14:45:31 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Oct 03 14:45:31 crc kubenswrapper[4774]: [+]process-running ok Oct 03 14:45:31 crc kubenswrapper[4774]: healthz check failed Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.672113 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s946h" podUID="67a98087-a2b3-457a-be93-8fd1203b825d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.701265 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-8kfbc" podStartSLOduration=31.701251474 podStartE2EDuration="31.701251474s" podCreationTimestamp="2025-10-03 14:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:31.699719096 +0000 UTC m=+154.288922548" watchObservedRunningTime="2025-10-03 14:45:31.701251474 +0000 UTC m=+154.290454926" Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.702847 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwml7" event={"ID":"48199fa0-dc2a-4df0-b3d9-dca85040f205","Type":"ContainerStarted","Data":"ea0659096376a3512dbe2db6d6fb615abf72df4d6d6ddeb15a12e6e302e996d0"} Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.703299 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:31 crc kubenswrapper[4774]: E1003 14:45:31.706306 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:32.206287332 +0000 UTC m=+154.795490784 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.709176 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwml7" Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.711842 4774 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-jwml7 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.711889 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwml7" podUID="48199fa0-dc2a-4df0-b3d9-dca85040f205" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.712982 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7nrwv" event={"ID":"b5e417ed-5b5e-405b-8b95-ed27ddaef9ee","Type":"ContainerStarted","Data":"36f19a86dd074d26799a7962c841ebb8b161bb56d1dd40013111e176b257466c"} Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.725520 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7z59k" podStartSLOduration=126.725498112 
podStartE2EDuration="2m6.725498112s" podCreationTimestamp="2025-10-03 14:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:31.725491482 +0000 UTC m=+154.314694934" watchObservedRunningTime="2025-10-03 14:45:31.725498112 +0000 UTC m=+154.314701564" Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.733129 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-hw8fn" event={"ID":"2ffc56e3-143d-4b1d-8ba3-1619951778e7","Type":"ContainerStarted","Data":"ddcdff18fb5c469c08149385aa88106c8c4537317f19261e4be5b504dec0bf04"} Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.736602 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jtjpd" event={"ID":"b35e64bf-83d7-47de-b783-5b602815d90a","Type":"ContainerStarted","Data":"8b0f9303a7f0c6b20c4f2095f879d8f7bdb1794d101b992b4dc0673a6b86a4ab"} Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.776729 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwml7" podStartSLOduration=125.776708824 podStartE2EDuration="2m5.776708824s" podCreationTimestamp="2025-10-03 14:43:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:31.776218119 +0000 UTC m=+154.365421571" watchObservedRunningTime="2025-10-03 14:45:31.776708824 +0000 UTC m=+154.365912276" Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.806025 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") 
" pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:31 crc kubenswrapper[4774]: E1003 14:45:31.809115 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:32.309102587 +0000 UTC m=+154.898306039 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.816685 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vkxrc" podStartSLOduration=125.816667464 podStartE2EDuration="2m5.816667464s" podCreationTimestamp="2025-10-03 14:43:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:31.812591936 +0000 UTC m=+154.401795398" watchObservedRunningTime="2025-10-03 14:45:31.816667464 +0000 UTC m=+154.405870916" Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.823803 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-dz4j8" event={"ID":"e0b599ea-9f7f-483a-a10b-4a5eb0fedce6","Type":"ContainerStarted","Data":"7a825db2e6ae387e937efddc290e0371df1471df87bf2b5a96b1b74ee669629e"} Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.920919 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-8s9fl" Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.920962 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:31 crc kubenswrapper[4774]: E1003 14:45:31.921992 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:32.421975317 +0000 UTC m=+155.011178769 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.962715 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8pqx5" event={"ID":"a4949a78-2546-4a0c-b280-871726da80af","Type":"ContainerStarted","Data":"af32afe9734ddb2ceda2c1df061b07237ddadbf1a26749724438e27784ef9cd1"} Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.980305 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"07ed5036d4027d1f750fc47e82df337cf6a3e0b08c7c9761d72efaf25daac8bb"} Oct 03 14:45:31 crc kubenswrapper[4774]: 
I1003 14:45:31.980947 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.982802 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-kcqsn" event={"ID":"88303d53-7218-4326-917e-c075e1ec1f12","Type":"ContainerStarted","Data":"0a34f66a80210e13a84127731f4942dce2328de7b1aab97cad471a721ba17eb5"} Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.982828 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-kcqsn" event={"ID":"88303d53-7218-4326-917e-c075e1ec1f12","Type":"ContainerStarted","Data":"aa2aa5fc33479ea38bf26653d3b4519c1ea949f5a73d059d44858e001f824bd1"} Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.986978 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-t8q7s" event={"ID":"c27f3211-f4ba-485a-9bf7-edf913003803","Type":"ContainerStarted","Data":"56a3770a611ad5a536c532fb6d8cdb36bfc727a5d6c4267022045ffa7c0f5500"} Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.991763 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" podStartSLOduration=126.991721659 podStartE2EDuration="2m6.991721659s" podCreationTimestamp="2025-10-03 14:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:31.882823553 +0000 UTC m=+154.472027005" watchObservedRunningTime="2025-10-03 14:45:31.991721659 +0000 UTC m=+154.580925111" Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.992051 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zh5qn" 
event={"ID":"83f48fb3-47e0-4266-9424-b54b47551fce","Type":"ContainerStarted","Data":"79e0220b0288ddebaa3eb38f46516e8c5db263eed670bc658c0805d8db7f6b36"} Oct 03 14:45:31 crc kubenswrapper[4774]: I1003 14:45:31.993774 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8s9fl" podStartSLOduration=126.993764923 podStartE2EDuration="2m6.993764923s" podCreationTimestamp="2025-10-03 14:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:31.988129296 +0000 UTC m=+154.577332748" watchObservedRunningTime="2025-10-03 14:45:31.993764923 +0000 UTC m=+154.582968375" Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.002323 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"0c9c60fc4aee6f2e8bb36338a3aeabe96b905d0c98b10ba1c9f44dc07a9e1a12"} Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.022562 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-wjcbg" event={"ID":"8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67","Type":"ContainerStarted","Data":"1a7f0fae38b0887a422c63a8bcfae1d0e83fae7cd735b521937ff7d90333860b"} Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.024241 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:32 crc kubenswrapper[4774]: E1003 14:45:32.024570 4774 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:32.524557286 +0000 UTC m=+155.113760738 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.063829 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pdnsv" event={"ID":"40de7d67-3283-492a-bfc8-65c83c19421f","Type":"ContainerStarted","Data":"6c90a1b78b48960a4034d998a1400d46bf6c6f5a08d79467f8bb326880b11af4"} Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.073694 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-t8q7s" podStartSLOduration=7.073674682 podStartE2EDuration="7.073674682s" podCreationTimestamp="2025-10-03 14:45:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:32.042137275 +0000 UTC m=+154.631340727" watchObservedRunningTime="2025-10-03 14:45:32.073674682 +0000 UTC m=+154.662878134" Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.091731 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4rcc5" event={"ID":"5a4b24b0-5a45-41de-a84b-f7e9d4e773f1","Type":"ContainerStarted","Data":"d39423a39175cc8954ec0cd5498ab1b05556100eb88eece279e57e7fca2c208b"} Oct 03 
14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.092625 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4rcc5" Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.103931 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8vg7" event={"ID":"825e4f26-ffbe-478b-a09b-7ffc436d4b4d","Type":"ContainerStarted","Data":"2e1dd4516eea11e934c9fafd0db95b1b071542a9ab4b4e77e2e17354d2f6380e"} Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.115848 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gwjdm" event={"ID":"cc2761ec-8878-46ce-aea2-234dd179fbe6","Type":"ContainerStarted","Data":"dcf8e5f96fdfcf2ce284aa131666ee47a11ed3655ecf41e71b16dd3a2f887556"} Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.126777 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-kcqsn" podStartSLOduration=126.126759292 podStartE2EDuration="2m6.126759292s" podCreationTimestamp="2025-10-03 14:43:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:32.125365788 +0000 UTC m=+154.714569250" watchObservedRunningTime="2025-10-03 14:45:32.126759292 +0000 UTC m=+154.715962754" Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.129184 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:32 crc kubenswrapper[4774]: E1003 14:45:32.131799 4774 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:32.631778219 +0000 UTC m=+155.220981671 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.139957 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jsbrr" event={"ID":"86a66a03-a1cd-4d49-84d5-392cfe63d1ae","Type":"ContainerStarted","Data":"05135c05b6fcb08c3ca9ca8dec3190ac220d001ba2b0a2b0b413f2280d0d46e6"} Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.140008 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jsbrr" event={"ID":"86a66a03-a1cd-4d49-84d5-392cfe63d1ae","Type":"ContainerStarted","Data":"67006c18335fe84ffae30e9360f45ac405c339cef107a8da2f691501a0551371"} Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.140021 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jsbrr" event={"ID":"86a66a03-a1cd-4d49-84d5-392cfe63d1ae","Type":"ContainerStarted","Data":"d15289edf50718f88d1d60dd223b6a409243a8f416a0ca74ebedc6f6d298cbd2"} Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.148824 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jsbrr" Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.190828 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dk4v5" event={"ID":"087458d5-9967-4dda-8084-028ea5b92cd1","Type":"ContainerStarted","Data":"23be539179a81c377d5e1d5d878d9acdd329a320fe9d4a971f439ceff90717f3"} Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.232041 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:32 crc kubenswrapper[4774]: E1003 14:45:32.233263 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:32.733245902 +0000 UTC m=+155.322449354 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.242835 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zh5qn" podStartSLOduration=127.242814702 podStartE2EDuration="2m7.242814702s" podCreationTimestamp="2025-10-03 14:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:32.236573776 +0000 UTC m=+154.825777228" watchObservedRunningTime="2025-10-03 14:45:32.242814702 +0000 UTC m=+154.832018154" Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.277232 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l9c9f" event={"ID":"e9be6b1c-e3c3-470e-9387-e2a6abd005aa","Type":"ContainerStarted","Data":"4924fc2db1d545bdf5cf17f7b48e6ede4f168ef87152ac3caf9d65a7856a3aae"} Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.339404 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-wjcbg" podStartSLOduration=127.339366351 podStartE2EDuration="2m7.339366351s" podCreationTimestamp="2025-10-03 14:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:32.338803254 +0000 UTC m=+154.928006706" watchObservedRunningTime="2025-10-03 14:45:32.339366351 +0000 UTC 
m=+154.928569803" Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.339860 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s7km8" event={"ID":"f27658be-3b8d-4706-9e91-831e832c34a7","Type":"ContainerStarted","Data":"7490b6cd317b97eb52d6b7a7d96d60ef7d1189b7b586298662cad2501aeab8f1"} Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.354869 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:32 crc kubenswrapper[4774]: E1003 14:45:32.355597 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:32.855575448 +0000 UTC m=+155.444778900 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.363774 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-vpxwc" Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.457146 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:32 crc kubenswrapper[4774]: E1003 14:45:32.460878 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:32.960858301 +0000 UTC m=+155.550061823 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.463093 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4rcc5" podStartSLOduration=126.46307607 podStartE2EDuration="2m6.46307607s" podCreationTimestamp="2025-10-03 14:43:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:32.45571066 +0000 UTC m=+155.044914112" watchObservedRunningTime="2025-10-03 14:45:32.46307607 +0000 UTC m=+155.052279522" Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.531637 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8vg7" podStartSLOduration=127.531613994 podStartE2EDuration="2m7.531613994s" podCreationTimestamp="2025-10-03 14:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:32.511011199 +0000 UTC m=+155.100214661" watchObservedRunningTime="2025-10-03 14:45:32.531613994 +0000 UTC m=+155.120817446" Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.568344 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pdnsv" podStartSLOduration=126.568326352 podStartE2EDuration="2m6.568326352s" podCreationTimestamp="2025-10-03 14:43:26 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:32.56697105 +0000 UTC m=+155.156174512" watchObservedRunningTime="2025-10-03 14:45:32.568326352 +0000 UTC m=+155.157529804" Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.570448 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:32 crc kubenswrapper[4774]: E1003 14:45:32.571345 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:33.071326336 +0000 UTC m=+155.660529788 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.680151 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:32 crc kubenswrapper[4774]: E1003 14:45:32.680824 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:33.180809679 +0000 UTC m=+155.770013131 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.686593 4774 patch_prober.go:28] interesting pod/router-default-5444994796-s946h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 14:45:32 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Oct 03 14:45:32 crc kubenswrapper[4774]: [+]process-running ok Oct 03 14:45:32 crc kubenswrapper[4774]: healthz check failed Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.686658 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s946h" podUID="67a98087-a2b3-457a-be93-8fd1203b825d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.717545 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l9c9f" podStartSLOduration=127.717521097 podStartE2EDuration="2m7.717521097s" podCreationTimestamp="2025-10-03 14:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:32.686735074 +0000 UTC m=+155.275938556" watchObservedRunningTime="2025-10-03 14:45:32.717521097 +0000 UTC m=+155.306724569" Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.751692 4774 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-7qr8w" Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.781344 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:32 crc kubenswrapper[4774]: E1003 14:45:32.781660 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:33.281646283 +0000 UTC m=+155.870849735 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.879649 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jsbrr" podStartSLOduration=126.879628517 podStartE2EDuration="2m6.879628517s" podCreationTimestamp="2025-10-03 14:43:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:32.879145542 +0000 UTC m=+155.468348994" watchObservedRunningTime="2025-10-03 14:45:32.879628517 +0000 UTC m=+155.468831969" Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 
14:45:32.880404 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s7km8" podStartSLOduration=127.880397891 podStartE2EDuration="2m7.880397891s" podCreationTimestamp="2025-10-03 14:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:32.825717041 +0000 UTC m=+155.414920493" watchObservedRunningTime="2025-10-03 14:45:32.880397891 +0000 UTC m=+155.469601343" Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.888292 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:32 crc kubenswrapper[4774]: E1003 14:45:32.888663 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:33.388646269 +0000 UTC m=+155.977849721 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.909596 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-dk4v5" podStartSLOduration=7.909576714 podStartE2EDuration="7.909576714s" podCreationTimestamp="2025-10-03 14:45:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:32.90721346 +0000 UTC m=+155.496416912" watchObservedRunningTime="2025-10-03 14:45:32.909576714 +0000 UTC m=+155.498780166" Oct 03 14:45:32 crc kubenswrapper[4774]: I1003 14:45:32.988919 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:32 crc kubenswrapper[4774]: E1003 14:45:32.989691 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:33.489664338 +0000 UTC m=+156.078867790 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.090181 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:33 crc kubenswrapper[4774]: E1003 14:45:33.090739 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:33.590724509 +0000 UTC m=+156.179927961 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.093136 4774 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-4rcc5 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.093201 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4rcc5" podUID="5a4b24b0-5a45-41de-a84b-f7e9d4e773f1" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.38:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.196972 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:33 crc kubenswrapper[4774]: E1003 14:45:33.197589 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-03 14:45:33.69755875 +0000 UTC m=+156.286762232 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.197820 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:33 crc kubenswrapper[4774]: E1003 14:45:33.198168 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:33.698160579 +0000 UTC m=+156.287364031 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.298747 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:33 crc kubenswrapper[4774]: E1003 14:45:33.298862 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:33.798838798 +0000 UTC m=+156.388042250 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.299047 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:33 crc kubenswrapper[4774]: E1003 14:45:33.299328 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:33.799317933 +0000 UTC m=+156.388521385 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.346009 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-hw8fn" event={"ID":"2ffc56e3-143d-4b1d-8ba3-1619951778e7","Type":"ContainerStarted","Data":"91a8b3cb7bf90cc201ba95b8b4cf7490d4fb10d40d1a60aab84065705c6d2e3a"} Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.347255 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s7km8" event={"ID":"f27658be-3b8d-4706-9e91-831e832c34a7","Type":"ContainerStarted","Data":"c27b68d85b85eff056ad70e2299724830a70868af0b035eaf00848732527cff7"} Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.348902 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn" event={"ID":"352c3d22-5aaa-47c1-a712-9022736511f4","Type":"ContainerStarted","Data":"2b7ea9ab7a44d9432083706a550aa6b640e94d12960ffd17256ac0f46ce5ee31"} Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.351594 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" event={"ID":"98199f15-3582-4b64-a118-c97f9ddadd11","Type":"ContainerStarted","Data":"8663f6ef627cfad21508a7294ed1e6ea54676d20fbc19fe537cc7bbd645bb699"} Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.352975 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-dz4j8" 
event={"ID":"e0b599ea-9f7f-483a-a10b-4a5eb0fedce6","Type":"ContainerStarted","Data":"dee2409b837ab08c6b75192f3825eee92a4c42eb3d8c6396d3b6afaa4a560f65"} Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.354501 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dk4v5" event={"ID":"087458d5-9967-4dda-8084-028ea5b92cd1","Type":"ContainerStarted","Data":"0edea9c217b5c241b15f6fd5579897ca1ee5829a9c835f7cd7e9d1d360687750"} Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.356084 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wq2sd" event={"ID":"923b3b0f-6810-4504-9757-fe4761f6ed37","Type":"ContainerStarted","Data":"c733ea4a70192141edf13896a9e05ea86a2e061234cf9e5126aa9ee056a811a5"} Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.356704 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-wq2sd" Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.358241 4774 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-wq2sd container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.358274 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-wq2sd" podUID="923b3b0f-6810-4504-9757-fe4761f6ed37" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.359281 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7qxcv" 
event={"ID":"98cf7a02-ed29-4cd4-9f60-77659f186e4b","Type":"ContainerStarted","Data":"e96cde9a28879e0b0233e7ab40d1ac568756ee89149b61d1b85ca2fc05b58915"} Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.364676 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gwjdm" event={"ID":"cc2761ec-8878-46ce-aea2-234dd179fbe6","Type":"ContainerStarted","Data":"e2e3d0a49c7f79c32166357c7e547e8b79f35f3c2ac79b027a3d471aa0fed5ab"} Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.364725 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gwjdm" event={"ID":"cc2761ec-8878-46ce-aea2-234dd179fbe6","Type":"ContainerStarted","Data":"ae66ce58f622a5f5761da6919ea223ceee17e822f50eacbc364866e191468101"} Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.366018 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"dabed47ad20e7d3419802630b6f1d708dfe7d4750258a8ea4462f895124eebae"} Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.367442 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-wjcbg" event={"ID":"8e70d4c5-0a0f-4eff-b4cf-0fdcb3fcfc67","Type":"ContainerStarted","Data":"2de6da4b6d3f57c2ceca798d37d64ee1520a18c399233a8054c872e4d53ab2a5"} Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.368984 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gwcm9" event={"ID":"15557932-56e7-4119-bdec-104aa40ae284","Type":"ContainerStarted","Data":"64a576ff674fcacaa4e6f5dc9aa0280bb8b20a47b6ab8484262101eb947825e6"} Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.370624 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4ba4a839321b406e1ecc7d9adf634e33f3eb208f5a3efeb1311e681c07170f10"} Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.372578 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jtjpd" event={"ID":"b35e64bf-83d7-47de-b783-5b602815d90a","Type":"ContainerStarted","Data":"4cc717560149f7764b0c21f4e5ae0cb5c39306f20ab859cdfc04a972266d30f3"} Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.373754 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a5b059f49dcf8bc4e174570aedbd39c631a8c2b59bed892fc83572220d7b42c5"} Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.375602 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8s9fl" event={"ID":"0d562068-166d-42b5-9f5f-3e33460f5410","Type":"ContainerStarted","Data":"4568b6c53fc31f0bcb08c30a70db7ee437c52af7b3b2c0e9f7aa09ec3e08fed2"} Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.384637 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8pqx5" event={"ID":"a4949a78-2546-4a0c-b280-871726da80af","Type":"ContainerStarted","Data":"c78bd13be90e58d64a11890864708216fe47a98e2a3ff9ce98554c2f23aa9819"} Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.384681 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-8pqx5" Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.384692 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8pqx5" 
event={"ID":"a4949a78-2546-4a0c-b280-871726da80af","Type":"ContainerStarted","Data":"c30f9a406eb8ff107bfcd648a668a99100e5e1175a0e4dd7dc4e427eccb36a65"} Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.397926 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g6pd2" Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.399867 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:33 crc kubenswrapper[4774]: E1003 14:45:33.401404 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:33.901346354 +0000 UTC m=+156.490549816 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.402746 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4rcc5" Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.408169 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwml7" Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.419744 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:33 crc kubenswrapper[4774]: E1003 14:45:33.439528 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:33.939513698 +0000 UTC m=+156.528717150 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.449229 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn" podStartSLOduration=127.449210361 podStartE2EDuration="2m7.449210361s" podCreationTimestamp="2025-10-03 14:43:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:33.438785975 +0000 UTC m=+156.027989427" watchObservedRunningTime="2025-10-03 14:45:33.449210361 +0000 UTC m=+156.038413813" Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.455796 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-hw8fn" podStartSLOduration=128.455780866 podStartE2EDuration="2m8.455780866s" podCreationTimestamp="2025-10-03 14:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:33.379981366 +0000 UTC m=+155.969184818" watchObservedRunningTime="2025-10-03 14:45:33.455780866 +0000 UTC m=+156.044984318" Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.486512 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-wq2sd" podStartSLOduration=127.486495967 podStartE2EDuration="2m7.486495967s" podCreationTimestamp="2025-10-03 14:43:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:33.486128836 +0000 UTC m=+156.075332288" watchObservedRunningTime="2025-10-03 14:45:33.486495967 +0000 UTC m=+156.075699419" Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.521867 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:33 crc kubenswrapper[4774]: E1003 14:45:33.522066 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:34.022039069 +0000 UTC m=+156.611242521 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.522315 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:33 crc kubenswrapper[4774]: E1003 14:45:33.524178 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:34.024139884 +0000 UTC m=+156.613343336 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.551326 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gwjdm" podStartSLOduration=127.551306044 podStartE2EDuration="2m7.551306044s" podCreationTimestamp="2025-10-03 14:43:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:33.5281438 +0000 UTC m=+156.117347262" watchObservedRunningTime="2025-10-03 14:45:33.551306044 +0000 UTC m=+156.140509496" Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.576970 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7qxcv" podStartSLOduration=127.576949236 podStartE2EDuration="2m7.576949236s" podCreationTimestamp="2025-10-03 14:43:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:33.554317758 +0000 UTC m=+156.143521210" watchObservedRunningTime="2025-10-03 14:45:33.576949236 +0000 UTC m=+156.166152698" Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.578635 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-dz4j8" podStartSLOduration=127.578629009 podStartE2EDuration="2m7.578629009s" podCreationTimestamp="2025-10-03 14:43:26 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:33.574561751 +0000 UTC m=+156.163765213" watchObservedRunningTime="2025-10-03 14:45:33.578629009 +0000 UTC m=+156.167832471" Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.625156 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:33 crc kubenswrapper[4774]: E1003 14:45:33.625572 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:34.125550406 +0000 UTC m=+156.714753868 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.664072 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-gwcm9" podStartSLOduration=127.66405746 podStartE2EDuration="2m7.66405746s" podCreationTimestamp="2025-10-03 14:43:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:33.66118248 +0000 UTC m=+156.250385932" watchObservedRunningTime="2025-10-03 14:45:33.66405746 +0000 UTC m=+156.253260912" Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.665685 4774 patch_prober.go:28] interesting pod/router-default-5444994796-s946h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 14:45:33 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Oct 03 14:45:33 crc kubenswrapper[4774]: [+]process-running ok Oct 03 14:45:33 crc kubenswrapper[4774]: healthz check failed Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.665773 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s946h" podUID="67a98087-a2b3-457a-be93-8fd1203b825d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.726995 4774 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:33 crc kubenswrapper[4774]: E1003 14:45:33.727350 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:34.227335579 +0000 UTC m=+156.816539041 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.757069 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" podStartSLOduration=128.757046579 podStartE2EDuration="2m8.757046579s" podCreationTimestamp="2025-10-03 14:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:33.75229377 +0000 UTC m=+156.341497232" watchObservedRunningTime="2025-10-03 14:45:33.757046579 +0000 UTC m=+156.346250031" Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.828153 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:33 crc kubenswrapper[4774]: E1003 14:45:33.828318 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:34.328293037 +0000 UTC m=+156.917496489 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.828393 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:33 crc kubenswrapper[4774]: E1003 14:45:33.828684 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:34.328676839 +0000 UTC m=+156.917880291 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.863078 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-8pqx5" podStartSLOduration=8.863060674 podStartE2EDuration="8.863060674s" podCreationTimestamp="2025-10-03 14:45:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:33.84887587 +0000 UTC m=+156.438079342" watchObservedRunningTime="2025-10-03 14:45:33.863060674 +0000 UTC m=+156.452264126" Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.864309 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lvvp2"] Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.865256 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lvvp2" Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.871521 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.927763 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lvvp2"] Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.929697 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.929884 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dswf2\" (UniqueName: \"kubernetes.io/projected/e4a6d0e1-38db-42e1-8e29-b52de5bbad1d-kube-api-access-dswf2\") pod \"certified-operators-lvvp2\" (UID: \"e4a6d0e1-38db-42e1-8e29-b52de5bbad1d\") " pod="openshift-marketplace/certified-operators-lvvp2" Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.929939 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4a6d0e1-38db-42e1-8e29-b52de5bbad1d-utilities\") pod \"certified-operators-lvvp2\" (UID: \"e4a6d0e1-38db-42e1-8e29-b52de5bbad1d\") " pod="openshift-marketplace/certified-operators-lvvp2" Oct 03 14:45:33 crc kubenswrapper[4774]: I1003 14:45:33.929976 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4a6d0e1-38db-42e1-8e29-b52de5bbad1d-catalog-content\") pod \"certified-operators-lvvp2\" (UID: 
\"e4a6d0e1-38db-42e1-8e29-b52de5bbad1d\") " pod="openshift-marketplace/certified-operators-lvvp2" Oct 03 14:45:33 crc kubenswrapper[4774]: E1003 14:45:33.930062 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:34.430048709 +0000 UTC m=+157.019252161 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.031473 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4a6d0e1-38db-42e1-8e29-b52de5bbad1d-utilities\") pod \"certified-operators-lvvp2\" (UID: \"e4a6d0e1-38db-42e1-8e29-b52de5bbad1d\") " pod="openshift-marketplace/certified-operators-lvvp2" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.031768 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4a6d0e1-38db-42e1-8e29-b52de5bbad1d-catalog-content\") pod \"certified-operators-lvvp2\" (UID: \"e4a6d0e1-38db-42e1-8e29-b52de5bbad1d\") " pod="openshift-marketplace/certified-operators-lvvp2" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.031802 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.031829 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dswf2\" (UniqueName: \"kubernetes.io/projected/e4a6d0e1-38db-42e1-8e29-b52de5bbad1d-kube-api-access-dswf2\") pod \"certified-operators-lvvp2\" (UID: \"e4a6d0e1-38db-42e1-8e29-b52de5bbad1d\") " pod="openshift-marketplace/certified-operators-lvvp2" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.032464 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4a6d0e1-38db-42e1-8e29-b52de5bbad1d-utilities\") pod \"certified-operators-lvvp2\" (UID: \"e4a6d0e1-38db-42e1-8e29-b52de5bbad1d\") " pod="openshift-marketplace/certified-operators-lvvp2" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.032689 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4a6d0e1-38db-42e1-8e29-b52de5bbad1d-catalog-content\") pod \"certified-operators-lvvp2\" (UID: \"e4a6d0e1-38db-42e1-8e29-b52de5bbad1d\") " pod="openshift-marketplace/certified-operators-lvvp2" Oct 03 14:45:34 crc kubenswrapper[4774]: E1003 14:45:34.032918 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:34.532907766 +0000 UTC m=+157.122111218 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.104895 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2bx5m"] Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.105843 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2bx5m" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.111285 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.132332 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.132508 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/062191e1-9f34-4dba-bd1c-9bffe53f5cfd-catalog-content\") pod \"community-operators-2bx5m\" (UID: \"062191e1-9f34-4dba-bd1c-9bffe53f5cfd\") " pod="openshift-marketplace/community-operators-2bx5m" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.132537 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ndlls\" (UniqueName: \"kubernetes.io/projected/062191e1-9f34-4dba-bd1c-9bffe53f5cfd-kube-api-access-ndlls\") pod \"community-operators-2bx5m\" (UID: \"062191e1-9f34-4dba-bd1c-9bffe53f5cfd\") " pod="openshift-marketplace/community-operators-2bx5m" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.132620 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/062191e1-9f34-4dba-bd1c-9bffe53f5cfd-utilities\") pod \"community-operators-2bx5m\" (UID: \"062191e1-9f34-4dba-bd1c-9bffe53f5cfd\") " pod="openshift-marketplace/community-operators-2bx5m" Oct 03 14:45:34 crc kubenswrapper[4774]: E1003 14:45:34.132760 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:34.632744589 +0000 UTC m=+157.221948041 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.164820 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2bx5m"] Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.202248 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dswf2\" (UniqueName: \"kubernetes.io/projected/e4a6d0e1-38db-42e1-8e29-b52de5bbad1d-kube-api-access-dswf2\") pod \"certified-operators-lvvp2\" (UID: \"e4a6d0e1-38db-42e1-8e29-b52de5bbad1d\") " pod="openshift-marketplace/certified-operators-lvvp2" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.235073 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.235107 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/062191e1-9f34-4dba-bd1c-9bffe53f5cfd-utilities\") pod \"community-operators-2bx5m\" (UID: \"062191e1-9f34-4dba-bd1c-9bffe53f5cfd\") " pod="openshift-marketplace/community-operators-2bx5m" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.235154 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/062191e1-9f34-4dba-bd1c-9bffe53f5cfd-catalog-content\") pod \"community-operators-2bx5m\" (UID: \"062191e1-9f34-4dba-bd1c-9bffe53f5cfd\") " pod="openshift-marketplace/community-operators-2bx5m" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.235175 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndlls\" (UniqueName: \"kubernetes.io/projected/062191e1-9f34-4dba-bd1c-9bffe53f5cfd-kube-api-access-ndlls\") pod \"community-operators-2bx5m\" (UID: \"062191e1-9f34-4dba-bd1c-9bffe53f5cfd\") " pod="openshift-marketplace/community-operators-2bx5m" Oct 03 14:45:34 crc kubenswrapper[4774]: E1003 14:45:34.235671 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:34.735660337 +0000 UTC m=+157.324863779 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.236128 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/062191e1-9f34-4dba-bd1c-9bffe53f5cfd-utilities\") pod \"community-operators-2bx5m\" (UID: \"062191e1-9f34-4dba-bd1c-9bffe53f5cfd\") " pod="openshift-marketplace/community-operators-2bx5m" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.236351 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/062191e1-9f34-4dba-bd1c-9bffe53f5cfd-catalog-content\") pod \"community-operators-2bx5m\" (UID: \"062191e1-9f34-4dba-bd1c-9bffe53f5cfd\") " pod="openshift-marketplace/community-operators-2bx5m" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.267358 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wbncg"] Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.268240 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wbncg" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.299131 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndlls\" (UniqueName: \"kubernetes.io/projected/062191e1-9f34-4dba-bd1c-9bffe53f5cfd-kube-api-access-ndlls\") pod \"community-operators-2bx5m\" (UID: \"062191e1-9f34-4dba-bd1c-9bffe53f5cfd\") " pod="openshift-marketplace/community-operators-2bx5m" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.337882 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.338151 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78rsr\" (UniqueName: \"kubernetes.io/projected/8656ee03-e27c-4ba6-a803-ab372ecb9b7b-kube-api-access-78rsr\") pod \"certified-operators-wbncg\" (UID: \"8656ee03-e27c-4ba6-a803-ab372ecb9b7b\") " pod="openshift-marketplace/certified-operators-wbncg" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.338188 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8656ee03-e27c-4ba6-a803-ab372ecb9b7b-utilities\") pod \"certified-operators-wbncg\" (UID: \"8656ee03-e27c-4ba6-a803-ab372ecb9b7b\") " pod="openshift-marketplace/certified-operators-wbncg" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.338217 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8656ee03-e27c-4ba6-a803-ab372ecb9b7b-catalog-content\") pod 
\"certified-operators-wbncg\" (UID: \"8656ee03-e27c-4ba6-a803-ab372ecb9b7b\") " pod="openshift-marketplace/certified-operators-wbncg" Oct 03 14:45:34 crc kubenswrapper[4774]: E1003 14:45:34.338332 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:34.838314028 +0000 UTC m=+157.427517480 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.382984 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wbncg"] Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.394070 4774 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-p27cs container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.20:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.394134 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" podUID="aeba6428-0be6-4f7e-96f8-5b23fb08dd3b" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.20:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 03 
14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.432051 4774 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-wq2sd container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.432103 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-wq2sd" podUID="923b3b0f-6810-4504-9757-fe4761f6ed37" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.433094 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2bx5m" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.453560 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78rsr\" (UniqueName: \"kubernetes.io/projected/8656ee03-e27c-4ba6-a803-ab372ecb9b7b-kube-api-access-78rsr\") pod \"certified-operators-wbncg\" (UID: \"8656ee03-e27c-4ba6-a803-ab372ecb9b7b\") " pod="openshift-marketplace/certified-operators-wbncg" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.453818 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8656ee03-e27c-4ba6-a803-ab372ecb9b7b-utilities\") pod \"certified-operators-wbncg\" (UID: \"8656ee03-e27c-4ba6-a803-ab372ecb9b7b\") " pod="openshift-marketplace/certified-operators-wbncg" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.453921 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.454041 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8656ee03-e27c-4ba6-a803-ab372ecb9b7b-catalog-content\") pod \"certified-operators-wbncg\" (UID: \"8656ee03-e27c-4ba6-a803-ab372ecb9b7b\") " pod="openshift-marketplace/certified-operators-wbncg" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.465973 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8656ee03-e27c-4ba6-a803-ab372ecb9b7b-utilities\") pod \"certified-operators-wbncg\" (UID: \"8656ee03-e27c-4ba6-a803-ab372ecb9b7b\") " pod="openshift-marketplace/certified-operators-wbncg" Oct 03 14:45:34 crc kubenswrapper[4774]: E1003 14:45:34.466354 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:34.966336752 +0000 UTC m=+157.555540204 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.466545 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8656ee03-e27c-4ba6-a803-ab372ecb9b7b-catalog-content\") pod \"certified-operators-wbncg\" (UID: \"8656ee03-e27c-4ba6-a803-ab372ecb9b7b\") " pod="openshift-marketplace/certified-operators-wbncg" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.474380 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6p5qb"] Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.475813 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6p5qb" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.485135 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lvvp2" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.500040 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6p5qb"] Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.546278 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78rsr\" (UniqueName: \"kubernetes.io/projected/8656ee03-e27c-4ba6-a803-ab372ecb9b7b-kube-api-access-78rsr\") pod \"certified-operators-wbncg\" (UID: \"8656ee03-e27c-4ba6-a803-ab372ecb9b7b\") " pod="openshift-marketplace/certified-operators-wbncg" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.555586 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.556162 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkmzf\" (UniqueName: \"kubernetes.io/projected/082e67c6-a87d-48ea-90e3-e20178613597-kube-api-access-zkmzf\") pod \"community-operators-6p5qb\" (UID: \"082e67c6-a87d-48ea-90e3-e20178613597\") " pod="openshift-marketplace/community-operators-6p5qb" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.556294 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/082e67c6-a87d-48ea-90e3-e20178613597-utilities\") pod \"community-operators-6p5qb\" (UID: \"082e67c6-a87d-48ea-90e3-e20178613597\") " pod="openshift-marketplace/community-operators-6p5qb" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.556409 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/082e67c6-a87d-48ea-90e3-e20178613597-catalog-content\") pod \"community-operators-6p5qb\" (UID: \"082e67c6-a87d-48ea-90e3-e20178613597\") " pod="openshift-marketplace/community-operators-6p5qb" Oct 03 14:45:34 crc kubenswrapper[4774]: E1003 14:45:34.556510 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:35.056497532 +0000 UTC m=+157.645700984 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.569703 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.605511 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wbncg" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.659179 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/082e67c6-a87d-48ea-90e3-e20178613597-utilities\") pod \"community-operators-6p5qb\" (UID: \"082e67c6-a87d-48ea-90e3-e20178613597\") " pod="openshift-marketplace/community-operators-6p5qb" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.659255 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.659278 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/082e67c6-a87d-48ea-90e3-e20178613597-catalog-content\") pod \"community-operators-6p5qb\" (UID: \"082e67c6-a87d-48ea-90e3-e20178613597\") " pod="openshift-marketplace/community-operators-6p5qb" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.659342 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkmzf\" (UniqueName: \"kubernetes.io/projected/082e67c6-a87d-48ea-90e3-e20178613597-kube-api-access-zkmzf\") pod \"community-operators-6p5qb\" (UID: \"082e67c6-a87d-48ea-90e3-e20178613597\") " pod="openshift-marketplace/community-operators-6p5qb" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.660061 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/082e67c6-a87d-48ea-90e3-e20178613597-utilities\") pod \"community-operators-6p5qb\" 
(UID: \"082e67c6-a87d-48ea-90e3-e20178613597\") " pod="openshift-marketplace/community-operators-6p5qb" Oct 03 14:45:34 crc kubenswrapper[4774]: E1003 14:45:34.660299 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:35.160288248 +0000 UTC m=+157.749491700 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.660650 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/082e67c6-a87d-48ea-90e3-e20178613597-catalog-content\") pod \"community-operators-6p5qb\" (UID: \"082e67c6-a87d-48ea-90e3-e20178613597\") " pod="openshift-marketplace/community-operators-6p5qb" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.682352 4774 patch_prober.go:28] interesting pod/router-default-5444994796-s946h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 14:45:34 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Oct 03 14:45:34 crc kubenswrapper[4774]: [+]process-running ok Oct 03 14:45:34 crc kubenswrapper[4774]: healthz check failed Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.682433 4774 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-s946h" podUID="67a98087-a2b3-457a-be93-8fd1203b825d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.758071 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkmzf\" (UniqueName: \"kubernetes.io/projected/082e67c6-a87d-48ea-90e3-e20178613597-kube-api-access-zkmzf\") pod \"community-operators-6p5qb\" (UID: \"082e67c6-a87d-48ea-90e3-e20178613597\") " pod="openshift-marketplace/community-operators-6p5qb" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.762397 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:34 crc kubenswrapper[4774]: E1003 14:45:34.762768 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:35.262752692 +0000 UTC m=+157.851956144 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.868114 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:34 crc kubenswrapper[4774]: E1003 14:45:34.868604 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:35.368589212 +0000 UTC m=+157.957792674 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.908762 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6p5qb" Oct 03 14:45:34 crc kubenswrapper[4774]: I1003 14:45:34.973192 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:34 crc kubenswrapper[4774]: E1003 14:45:34.973614 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:35.473593646 +0000 UTC m=+158.062797098 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:35 crc kubenswrapper[4774]: I1003 14:45:35.075113 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:35 crc kubenswrapper[4774]: E1003 14:45:35.075441 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:35.575427331 +0000 UTC m=+158.164630783 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:35 crc kubenswrapper[4774]: I1003 14:45:35.167111 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lvvp2"] Oct 03 14:45:35 crc kubenswrapper[4774]: I1003 14:45:35.175970 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:35 crc kubenswrapper[4774]: E1003 14:45:35.176419 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:35.676351068 +0000 UTC m=+158.265554520 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:35 crc kubenswrapper[4774]: I1003 14:45:35.241542 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8s9fl" Oct 03 14:45:35 crc kubenswrapper[4774]: I1003 14:45:35.277641 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:35 crc kubenswrapper[4774]: E1003 14:45:35.277930 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:35.777918314 +0000 UTC m=+158.367121766 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:35 crc kubenswrapper[4774]: I1003 14:45:35.314894 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2bx5m"] Oct 03 14:45:35 crc kubenswrapper[4774]: W1003 14:45:35.339273 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod062191e1_9f34_4dba_bd1c_9bffe53f5cfd.slice/crio-97220c6cc1c72285ae50a28f00349fdf7be9db99f94c663a054f070dfdc4ed87 WatchSource:0}: Error finding container 97220c6cc1c72285ae50a28f00349fdf7be9db99f94c663a054f070dfdc4ed87: Status 404 returned error can't find the container with id 97220c6cc1c72285ae50a28f00349fdf7be9db99f94c663a054f070dfdc4ed87 Oct 03 14:45:35 crc kubenswrapper[4774]: I1003 14:45:35.386952 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:35 crc kubenswrapper[4774]: E1003 14:45:35.387263 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:35.887246384 +0000 UTC m=+158.476449836 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:35 crc kubenswrapper[4774]: I1003 14:45:35.471784 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2bx5m" event={"ID":"062191e1-9f34-4dba-bd1c-9bffe53f5cfd","Type":"ContainerStarted","Data":"97220c6cc1c72285ae50a28f00349fdf7be9db99f94c663a054f070dfdc4ed87"} Oct 03 14:45:35 crc kubenswrapper[4774]: I1003 14:45:35.473120 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lvvp2" event={"ID":"e4a6d0e1-38db-42e1-8e29-b52de5bbad1d","Type":"ContainerStarted","Data":"021f06d1ec720728ea848cd7fb3d89d12fd0ebdb382e0340fcf99bab1045d1d3"} Oct 03 14:45:35 crc kubenswrapper[4774]: I1003 14:45:35.479403 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jtjpd" event={"ID":"b35e64bf-83d7-47de-b783-5b602815d90a","Type":"ContainerStarted","Data":"9ab38c52389dedd66e546bacdb31e7f1ad3dbbad14fca98f04536f3ea4ccbbdd"} Oct 03 14:45:35 crc kubenswrapper[4774]: I1003 14:45:35.480355 4774 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-wq2sd container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Oct 03 14:45:35 crc kubenswrapper[4774]: I1003 14:45:35.480403 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-wq2sd" 
podUID="923b3b0f-6810-4504-9757-fe4761f6ed37" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Oct 03 14:45:35 crc kubenswrapper[4774]: I1003 14:45:35.487472 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6p5qb"] Oct 03 14:45:35 crc kubenswrapper[4774]: I1003 14:45:35.487940 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:35 crc kubenswrapper[4774]: E1003 14:45:35.491992 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:35.991975479 +0000 UTC m=+158.581178931 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:35 crc kubenswrapper[4774]: W1003 14:45:35.514578 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod082e67c6_a87d_48ea_90e3_e20178613597.slice/crio-59a6617e964a4e8ba76b9abc3b1dba2231af5975de0a554348f1c9516fbe0ce2 WatchSource:0}: Error finding container 59a6617e964a4e8ba76b9abc3b1dba2231af5975de0a554348f1c9516fbe0ce2: Status 404 returned error can't find the container with id 59a6617e964a4e8ba76b9abc3b1dba2231af5975de0a554348f1c9516fbe0ce2 Oct 03 14:45:35 crc kubenswrapper[4774]: I1003 14:45:35.588845 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:35 crc kubenswrapper[4774]: E1003 14:45:35.589103 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:36.089089176 +0000 UTC m=+158.678292628 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:35 crc kubenswrapper[4774]: I1003 14:45:35.647008 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wbncg"] Oct 03 14:45:35 crc kubenswrapper[4774]: I1003 14:45:35.668839 4774 patch_prober.go:28] interesting pod/router-default-5444994796-s946h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 14:45:35 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Oct 03 14:45:35 crc kubenswrapper[4774]: [+]process-running ok Oct 03 14:45:35 crc kubenswrapper[4774]: healthz check failed Oct 03 14:45:35 crc kubenswrapper[4774]: I1003 14:45:35.668933 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s946h" podUID="67a98087-a2b3-457a-be93-8fd1203b825d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 14:45:35 crc kubenswrapper[4774]: I1003 14:45:35.690998 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:35 crc kubenswrapper[4774]: E1003 14:45:35.695509 4774 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:36.195465993 +0000 UTC m=+158.784669445 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:35 crc kubenswrapper[4774]: W1003 14:45:35.759127 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8656ee03_e27c_4ba6_a803_ab372ecb9b7b.slice/crio-184ae1adfaf5cd185d9ee92ea6ccfc99fc82df45d4199b68d66b29d19493f3cf WatchSource:0}: Error finding container 184ae1adfaf5cd185d9ee92ea6ccfc99fc82df45d4199b68d66b29d19493f3cf: Status 404 returned error can't find the container with id 184ae1adfaf5cd185d9ee92ea6ccfc99fc82df45d4199b68d66b29d19493f3cf Oct 03 14:45:35 crc kubenswrapper[4774]: I1003 14:45:35.794808 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:35 crc kubenswrapper[4774]: E1003 14:45:35.795195 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-03 14:45:36.295179282 +0000 UTC m=+158.884382734 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:35 crc kubenswrapper[4774]: I1003 14:45:35.896185 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:35 crc kubenswrapper[4774]: E1003 14:45:35.896731 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:36.396719327 +0000 UTC m=+158.985922779 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:35 crc kubenswrapper[4774]: I1003 14:45:35.997528 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:35 crc kubenswrapper[4774]: E1003 14:45:35.997841 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:36.49782543 +0000 UTC m=+159.087028882 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.098815 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:36 crc kubenswrapper[4774]: E1003 14:45:36.099105 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:36.599093777 +0000 UTC m=+159.188297229 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.199786 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:36 crc kubenswrapper[4774]: E1003 14:45:36.200353 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:36.700337982 +0000 UTC m=+159.289541434 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.233538 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hntbw"] Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.234698 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hntbw" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.237088 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.259938 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hntbw"] Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.301240 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j9pm\" (UniqueName: \"kubernetes.io/projected/8922cd85-deea-4326-88fc-68c67debf56c-kube-api-access-9j9pm\") pod \"redhat-marketplace-hntbw\" (UID: \"8922cd85-deea-4326-88fc-68c67debf56c\") " pod="openshift-marketplace/redhat-marketplace-hntbw" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.301295 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8922cd85-deea-4326-88fc-68c67debf56c-catalog-content\") pod \"redhat-marketplace-hntbw\" (UID: \"8922cd85-deea-4326-88fc-68c67debf56c\") " pod="openshift-marketplace/redhat-marketplace-hntbw" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.301328 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8922cd85-deea-4326-88fc-68c67debf56c-utilities\") pod \"redhat-marketplace-hntbw\" (UID: \"8922cd85-deea-4326-88fc-68c67debf56c\") " pod="openshift-marketplace/redhat-marketplace-hntbw" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.301391 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:36 crc kubenswrapper[4774]: E1003 14:45:36.301657 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:36.801646591 +0000 UTC m=+159.390850043 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.402067 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.402338 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j9pm\" (UniqueName: \"kubernetes.io/projected/8922cd85-deea-4326-88fc-68c67debf56c-kube-api-access-9j9pm\") pod \"redhat-marketplace-hntbw\" (UID: \"8922cd85-deea-4326-88fc-68c67debf56c\") " pod="openshift-marketplace/redhat-marketplace-hntbw" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.402407 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8922cd85-deea-4326-88fc-68c67debf56c-catalog-content\") pod \"redhat-marketplace-hntbw\" (UID: \"8922cd85-deea-4326-88fc-68c67debf56c\") " pod="openshift-marketplace/redhat-marketplace-hntbw" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.402453 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8922cd85-deea-4326-88fc-68c67debf56c-utilities\") pod \"redhat-marketplace-hntbw\" (UID: \"8922cd85-deea-4326-88fc-68c67debf56c\") " pod="openshift-marketplace/redhat-marketplace-hntbw" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.402938 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8922cd85-deea-4326-88fc-68c67debf56c-utilities\") pod \"redhat-marketplace-hntbw\" (UID: \"8922cd85-deea-4326-88fc-68c67debf56c\") " pod="openshift-marketplace/redhat-marketplace-hntbw" Oct 03 14:45:36 crc kubenswrapper[4774]: E1003 14:45:36.403245 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:36.903226548 +0000 UTC m=+159.492430000 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.403863 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8922cd85-deea-4326-88fc-68c67debf56c-catalog-content\") pod \"redhat-marketplace-hntbw\" (UID: \"8922cd85-deea-4326-88fc-68c67debf56c\") " pod="openshift-marketplace/redhat-marketplace-hntbw" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.424306 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j9pm\" (UniqueName: \"kubernetes.io/projected/8922cd85-deea-4326-88fc-68c67debf56c-kube-api-access-9j9pm\") pod \"redhat-marketplace-hntbw\" (UID: \"8922cd85-deea-4326-88fc-68c67debf56c\") " pod="openshift-marketplace/redhat-marketplace-hntbw" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.450999 4774 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.485988 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jtjpd" event={"ID":"b35e64bf-83d7-47de-b783-5b602815d90a","Type":"ContainerStarted","Data":"f96bcf45356e27ca5725177910a84d3c58176e8dfe3155b167514be5d9b548a5"} Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.486033 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jtjpd" 
event={"ID":"b35e64bf-83d7-47de-b783-5b602815d90a","Type":"ContainerStarted","Data":"480adf8e751b9797072a5174c543c002da9f7e7fad921e2d9dc994dd0161b885"} Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.487218 4774 generic.go:334] "Generic (PLEG): container finished" podID="062191e1-9f34-4dba-bd1c-9bffe53f5cfd" containerID="0d45cb133189514f14e10e01b4d32fa37359727eb1ea83fa484e8e7e9ba3a9aa" exitCode=0 Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.487293 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2bx5m" event={"ID":"062191e1-9f34-4dba-bd1c-9bffe53f5cfd","Type":"ContainerDied","Data":"0d45cb133189514f14e10e01b4d32fa37359727eb1ea83fa484e8e7e9ba3a9aa"} Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.488760 4774 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.489525 4774 generic.go:334] "Generic (PLEG): container finished" podID="082e67c6-a87d-48ea-90e3-e20178613597" containerID="e67058ff9e92987338d048ace401510a23b337c397b1507ea97d5e35a091521c" exitCode=0 Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.489677 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6p5qb" event={"ID":"082e67c6-a87d-48ea-90e3-e20178613597","Type":"ContainerDied","Data":"e67058ff9e92987338d048ace401510a23b337c397b1507ea97d5e35a091521c"} Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.489744 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6p5qb" event={"ID":"082e67c6-a87d-48ea-90e3-e20178613597","Type":"ContainerStarted","Data":"59a6617e964a4e8ba76b9abc3b1dba2231af5975de0a554348f1c9516fbe0ce2"} Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.492938 4774 generic.go:334] "Generic (PLEG): container finished" podID="8656ee03-e27c-4ba6-a803-ab372ecb9b7b" 
containerID="1c61f1a30913bf289b784895fc7d6d1c332d634278a052bf41eb774ae6c8f60f" exitCode=0 Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.493015 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbncg" event={"ID":"8656ee03-e27c-4ba6-a803-ab372ecb9b7b","Type":"ContainerDied","Data":"1c61f1a30913bf289b784895fc7d6d1c332d634278a052bf41eb774ae6c8f60f"} Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.493048 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbncg" event={"ID":"8656ee03-e27c-4ba6-a803-ab372ecb9b7b","Type":"ContainerStarted","Data":"184ae1adfaf5cd185d9ee92ea6ccfc99fc82df45d4199b68d66b29d19493f3cf"} Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.494901 4774 generic.go:334] "Generic (PLEG): container finished" podID="e4a6d0e1-38db-42e1-8e29-b52de5bbad1d" containerID="0a540d82efa8c4b60873f06a9e31ed8a7fd13e0d204f34e06e5f399194d9c554" exitCode=0 Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.494926 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lvvp2" event={"ID":"e4a6d0e1-38db-42e1-8e29-b52de5bbad1d","Type":"ContainerDied","Data":"0a540d82efa8c4b60873f06a9e31ed8a7fd13e0d204f34e06e5f399194d9c554"} Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.505499 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-jtjpd" podStartSLOduration=11.505485216 podStartE2EDuration="11.505485216s" podCreationTimestamp="2025-10-03 14:45:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:36.502700179 +0000 UTC m=+159.091903631" watchObservedRunningTime="2025-10-03 14:45:36.505485216 +0000 UTC m=+159.094688668" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.506619 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:36 crc kubenswrapper[4774]: E1003 14:45:36.507324 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:37.007310233 +0000 UTC m=+159.596513685 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.549303 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hntbw" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.608134 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:36 crc kubenswrapper[4774]: E1003 14:45:36.609193 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:45:37.109176929 +0000 UTC m=+159.698380371 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.629831 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6szd8"] Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.630953 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6szd8" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.647284 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6szd8"] Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.662257 4774 patch_prober.go:28] interesting pod/router-default-5444994796-s946h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 14:45:36 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Oct 03 14:45:36 crc kubenswrapper[4774]: [+]process-running ok Oct 03 14:45:36 crc kubenswrapper[4774]: healthz check failed Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.662317 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s946h" podUID="67a98087-a2b3-457a-be93-8fd1203b825d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.702673 4774 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-03T14:45:36.451029663Z","Handler":null,"Name":""} Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.711302 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.711394 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dfe8164-582f-46df-a0a7-790c7df9e6f9-catalog-content\") pod \"redhat-marketplace-6szd8\" (UID: \"8dfe8164-582f-46df-a0a7-790c7df9e6f9\") " pod="openshift-marketplace/redhat-marketplace-6szd8" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.711425 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dfe8164-582f-46df-a0a7-790c7df9e6f9-utilities\") pod \"redhat-marketplace-6szd8\" (UID: \"8dfe8164-582f-46df-a0a7-790c7df9e6f9\") " pod="openshift-marketplace/redhat-marketplace-6szd8" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.711486 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wchl4\" (UniqueName: \"kubernetes.io/projected/8dfe8164-582f-46df-a0a7-790c7df9e6f9-kube-api-access-wchl4\") pod \"redhat-marketplace-6szd8\" (UID: \"8dfe8164-582f-46df-a0a7-790c7df9e6f9\") " pod="openshift-marketplace/redhat-marketplace-6szd8" Oct 03 14:45:36 crc kubenswrapper[4774]: E1003 14:45:36.711784 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:45:37.211772647 +0000 UTC m=+159.800976099 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sjxw" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.734046 4774 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.734100 4774 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.740448 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.741589 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.745220 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.748117 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.755232 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.779617 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hntbw"] Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.812902 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.813159 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/907dd334-f95b-417d-bbd7-402486f2fcff-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"907dd334-f95b-417d-bbd7-402486f2fcff\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.813236 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dfe8164-582f-46df-a0a7-790c7df9e6f9-catalog-content\") pod \"redhat-marketplace-6szd8\" (UID: \"8dfe8164-582f-46df-a0a7-790c7df9e6f9\") " 
pod="openshift-marketplace/redhat-marketplace-6szd8" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.813268 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dfe8164-582f-46df-a0a7-790c7df9e6f9-utilities\") pod \"redhat-marketplace-6szd8\" (UID: \"8dfe8164-582f-46df-a0a7-790c7df9e6f9\") " pod="openshift-marketplace/redhat-marketplace-6szd8" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.813325 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wchl4\" (UniqueName: \"kubernetes.io/projected/8dfe8164-582f-46df-a0a7-790c7df9e6f9-kube-api-access-wchl4\") pod \"redhat-marketplace-6szd8\" (UID: \"8dfe8164-582f-46df-a0a7-790c7df9e6f9\") " pod="openshift-marketplace/redhat-marketplace-6szd8" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.813358 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/907dd334-f95b-417d-bbd7-402486f2fcff-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"907dd334-f95b-417d-bbd7-402486f2fcff\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.813893 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dfe8164-582f-46df-a0a7-790c7df9e6f9-utilities\") pod \"redhat-marketplace-6szd8\" (UID: \"8dfe8164-582f-46df-a0a7-790c7df9e6f9\") " pod="openshift-marketplace/redhat-marketplace-6szd8" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.813935 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dfe8164-582f-46df-a0a7-790c7df9e6f9-catalog-content\") pod \"redhat-marketplace-6szd8\" (UID: \"8dfe8164-582f-46df-a0a7-790c7df9e6f9\") " 
pod="openshift-marketplace/redhat-marketplace-6szd8" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.817812 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.834949 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wchl4\" (UniqueName: \"kubernetes.io/projected/8dfe8164-582f-46df-a0a7-790c7df9e6f9-kube-api-access-wchl4\") pod \"redhat-marketplace-6szd8\" (UID: \"8dfe8164-582f-46df-a0a7-790c7df9e6f9\") " pod="openshift-marketplace/redhat-marketplace-6szd8" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.914767 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/907dd334-f95b-417d-bbd7-402486f2fcff-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"907dd334-f95b-417d-bbd7-402486f2fcff\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.914820 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.914841 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/907dd334-f95b-417d-bbd7-402486f2fcff-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"907dd334-f95b-417d-bbd7-402486f2fcff\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.914885 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/907dd334-f95b-417d-bbd7-402486f2fcff-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"907dd334-f95b-417d-bbd7-402486f2fcff\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.941481 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/907dd334-f95b-417d-bbd7-402486f2fcff-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"907dd334-f95b-417d-bbd7-402486f2fcff\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.959273 4774 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.959309 4774 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.978833 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6szd8" Oct 03 14:45:36 crc kubenswrapper[4774]: I1003 14:45:36.984338 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sjxw\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.033236 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mg7ls"] Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.034320 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mg7ls" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.036598 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.048490 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mg7ls"] Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.064064 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.121342 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aad74fd-c975-42c4-8b05-ed28dbd55205-catalog-content\") pod \"redhat-operators-mg7ls\" (UID: \"2aad74fd-c975-42c4-8b05-ed28dbd55205\") " pod="openshift-marketplace/redhat-operators-mg7ls" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.121774 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwp6h\" (UniqueName: \"kubernetes.io/projected/2aad74fd-c975-42c4-8b05-ed28dbd55205-kube-api-access-cwp6h\") pod \"redhat-operators-mg7ls\" (UID: \"2aad74fd-c975-42c4-8b05-ed28dbd55205\") " pod="openshift-marketplace/redhat-operators-mg7ls" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.121802 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aad74fd-c975-42c4-8b05-ed28dbd55205-utilities\") pod \"redhat-operators-mg7ls\" (UID: \"2aad74fd-c975-42c4-8b05-ed28dbd55205\") " pod="openshift-marketplace/redhat-operators-mg7ls" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.222946 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aad74fd-c975-42c4-8b05-ed28dbd55205-catalog-content\") pod \"redhat-operators-mg7ls\" (UID: \"2aad74fd-c975-42c4-8b05-ed28dbd55205\") " pod="openshift-marketplace/redhat-operators-mg7ls" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.223024 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwp6h\" (UniqueName: \"kubernetes.io/projected/2aad74fd-c975-42c4-8b05-ed28dbd55205-kube-api-access-cwp6h\") pod 
\"redhat-operators-mg7ls\" (UID: \"2aad74fd-c975-42c4-8b05-ed28dbd55205\") " pod="openshift-marketplace/redhat-operators-mg7ls" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.223067 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aad74fd-c975-42c4-8b05-ed28dbd55205-utilities\") pod \"redhat-operators-mg7ls\" (UID: \"2aad74fd-c975-42c4-8b05-ed28dbd55205\") " pod="openshift-marketplace/redhat-operators-mg7ls" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.223998 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aad74fd-c975-42c4-8b05-ed28dbd55205-utilities\") pod \"redhat-operators-mg7ls\" (UID: \"2aad74fd-c975-42c4-8b05-ed28dbd55205\") " pod="openshift-marketplace/redhat-operators-mg7ls" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.224090 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aad74fd-c975-42c4-8b05-ed28dbd55205-catalog-content\") pod \"redhat-operators-mg7ls\" (UID: \"2aad74fd-c975-42c4-8b05-ed28dbd55205\") " pod="openshift-marketplace/redhat-operators-mg7ls" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.232721 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kj6jv"] Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.234006 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kj6jv" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.243234 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.255087 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kj6jv"] Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.280722 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwp6h\" (UniqueName: \"kubernetes.io/projected/2aad74fd-c975-42c4-8b05-ed28dbd55205-kube-api-access-cwp6h\") pod \"redhat-operators-mg7ls\" (UID: \"2aad74fd-c975-42c4-8b05-ed28dbd55205\") " pod="openshift-marketplace/redhat-operators-mg7ls" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.308839 4774 patch_prober.go:28] interesting pod/downloads-7954f5f757-jh8hv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body= Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.308888 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jh8hv" podUID="40ce0e8a-b3ee-4b5a-a3ef-ee0e4a573496" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.34:8080/\": dial tcp 10.217.0.34:8080: connect: connection refused" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.309704 4774 patch_prober.go:28] interesting pod/downloads-7954f5f757-jh8hv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.34:8080/\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body= Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.309736 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-jh8hv" podUID="40ce0e8a-b3ee-4b5a-a3ef-ee0e4a573496" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.34:8080/\": dial 
tcp 10.217.0.34:8080: connect: connection refused" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.315680 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.332991 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6szd8"] Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.333594 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea488297-1ed1-475d-8c6a-9b4881dd5583-utilities\") pod \"redhat-operators-kj6jv\" (UID: \"ea488297-1ed1-475d-8c6a-9b4881dd5583\") " pod="openshift-marketplace/redhat-operators-kj6jv" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.333670 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea488297-1ed1-475d-8c6a-9b4881dd5583-catalog-content\") pod \"redhat-operators-kj6jv\" (UID: \"ea488297-1ed1-475d-8c6a-9b4881dd5583\") " pod="openshift-marketplace/redhat-operators-kj6jv" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.333690 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxtfx\" (UniqueName: \"kubernetes.io/projected/ea488297-1ed1-475d-8c6a-9b4881dd5583-kube-api-access-xxtfx\") pod \"redhat-operators-kj6jv\" (UID: \"ea488297-1ed1-475d-8c6a-9b4881dd5583\") " pod="openshift-marketplace/redhat-operators-kj6jv" Oct 03 14:45:37 crc kubenswrapper[4774]: W1003 14:45:37.346561 4774 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8dfe8164_582f_46df_a0a7_790c7df9e6f9.slice/crio-410476bd19342eb970cc2a4d94063572cad85cea4ee0e7a326da4993b75a857e WatchSource:0}: Error finding container 410476bd19342eb970cc2a4d94063572cad85cea4ee0e7a326da4993b75a857e: Status 404 returned error can't find the container with id 410476bd19342eb970cc2a4d94063572cad85cea4ee0e7a326da4993b75a857e Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.348125 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mg7ls" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.368549 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.430335 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.430417 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.435700 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea488297-1ed1-475d-8c6a-9b4881dd5583-catalog-content\") pod \"redhat-operators-kj6jv\" (UID: \"ea488297-1ed1-475d-8c6a-9b4881dd5583\") " pod="openshift-marketplace/redhat-operators-kj6jv" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.435738 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxtfx\" (UniqueName: \"kubernetes.io/projected/ea488297-1ed1-475d-8c6a-9b4881dd5583-kube-api-access-xxtfx\") pod \"redhat-operators-kj6jv\" (UID: \"ea488297-1ed1-475d-8c6a-9b4881dd5583\") " pod="openshift-marketplace/redhat-operators-kj6jv" Oct 03 14:45:37 crc 
kubenswrapper[4774]: I1003 14:45:37.435813 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea488297-1ed1-475d-8c6a-9b4881dd5583-utilities\") pod \"redhat-operators-kj6jv\" (UID: \"ea488297-1ed1-475d-8c6a-9b4881dd5583\") " pod="openshift-marketplace/redhat-operators-kj6jv" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.436266 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea488297-1ed1-475d-8c6a-9b4881dd5583-catalog-content\") pod \"redhat-operators-kj6jv\" (UID: \"ea488297-1ed1-475d-8c6a-9b4881dd5583\") " pod="openshift-marketplace/redhat-operators-kj6jv" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.436522 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea488297-1ed1-475d-8c6a-9b4881dd5583-utilities\") pod \"redhat-operators-kj6jv\" (UID: \"ea488297-1ed1-475d-8c6a-9b4881dd5583\") " pod="openshift-marketplace/redhat-operators-kj6jv" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.437149 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.455687 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxtfx\" (UniqueName: \"kubernetes.io/projected/ea488297-1ed1-475d-8c6a-9b4881dd5583-kube-api-access-xxtfx\") pod \"redhat-operators-kj6jv\" (UID: \"ea488297-1ed1-475d-8c6a-9b4881dd5583\") " pod="openshift-marketplace/redhat-operators-kj6jv" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.530237 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"907dd334-f95b-417d-bbd7-402486f2fcff","Type":"ContainerStarted","Data":"acd5ee968b483f9fdf0ec76f30726f5b39f784ec9a160365b3f8be33d7a910a1"} Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.534200 4774 generic.go:334] "Generic (PLEG): container finished" podID="8922cd85-deea-4326-88fc-68c67debf56c" containerID="aef3313280418fbfc839cabcaf85c5e7bf37d120e395b981b70393f30dc222ba" exitCode=0 Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.534546 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hntbw" event={"ID":"8922cd85-deea-4326-88fc-68c67debf56c","Type":"ContainerDied","Data":"aef3313280418fbfc839cabcaf85c5e7bf37d120e395b981b70393f30dc222ba"} Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.534604 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hntbw" event={"ID":"8922cd85-deea-4326-88fc-68c67debf56c","Type":"ContainerStarted","Data":"0725ef5ae026f1a72a3f32f252c278772d86625ece750bcd771ec927412cff13"} Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.538508 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6szd8" event={"ID":"8dfe8164-582f-46df-a0a7-790c7df9e6f9","Type":"ContainerStarted","Data":"410476bd19342eb970cc2a4d94063572cad85cea4ee0e7a326da4993b75a857e"} Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.543553 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zp5qn" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.556658 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kj6jv" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.645284 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mg7ls"] Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.652564 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.652600 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.670143 4774 patch_prober.go:28] interesting pod/router-default-5444994796-s946h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 14:45:37 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Oct 03 14:45:37 crc kubenswrapper[4774]: [+]process-running ok Oct 03 14:45:37 crc kubenswrapper[4774]: healthz check failed Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.670209 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s946h" podUID="67a98087-a2b3-457a-be93-8fd1203b825d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 14:45:37 crc kubenswrapper[4774]: W1003 14:45:37.679721 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2aad74fd_c975_42c4_8b05_ed28dbd55205.slice/crio-3a1f1daa166150bf27f3602be244016c227bc44d51c375a5ba04c569a732e701 WatchSource:0}: Error finding container 3a1f1daa166150bf27f3602be244016c227bc44d51c375a5ba04c569a732e701: Status 404 returned error can't find the container with id 3a1f1daa166150bf27f3602be244016c227bc44d51c375a5ba04c569a732e701 Oct 03 
14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.683600 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.722427 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4sjxw"] Oct 03 14:45:37 crc kubenswrapper[4774]: W1003 14:45:37.807940 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d110133_f0c7_4b4c_a060_5a0fdb950e9f.slice/crio-ca2de3af8dd4f1cc3648cdebc623ca2347d2e7f6c54584aec12e2a45da3a95d3 WatchSource:0}: Error finding container ca2de3af8dd4f1cc3648cdebc623ca2347d2e7f6c54584aec12e2a45da3a95d3: Status 404 returned error can't find the container with id ca2de3af8dd4f1cc3648cdebc623ca2347d2e7f6c54584aec12e2a45da3a95d3 Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.916898 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-vnvw7" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.916930 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-vnvw7" Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.931777 4774 patch_prober.go:28] interesting pod/console-f9d7485db-vnvw7 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Oct 03 14:45:37 crc kubenswrapper[4774]: I1003 14:45:37.931852 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-vnvw7" podUID="9cae35f2-fcf0-4014-9b5b-9887d416e8d3" containerName="console" probeResult="failure" output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" Oct 03 14:45:37 crc kubenswrapper[4774]: 
I1003 14:45:37.945898 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kj6jv"] Oct 03 14:45:37 crc kubenswrapper[4774]: W1003 14:45:37.967828 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea488297_1ed1_475d_8c6a_9b4881dd5583.slice/crio-6fd5f0eafe0fce905e2e0341755601d4ca50392545238b39514f13c26705307b WatchSource:0}: Error finding container 6fd5f0eafe0fce905e2e0341755601d4ca50392545238b39514f13c26705307b: Status 404 returned error can't find the container with id 6fd5f0eafe0fce905e2e0341755601d4ca50392545238b39514f13c26705307b Oct 03 14:45:38 crc kubenswrapper[4774]: I1003 14:45:38.044484 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 03 14:45:38 crc kubenswrapper[4774]: I1003 14:45:38.045322 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 14:45:38 crc kubenswrapper[4774]: I1003 14:45:38.048019 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 03 14:45:38 crc kubenswrapper[4774]: I1003 14:45:38.048679 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 03 14:45:38 crc kubenswrapper[4774]: I1003 14:45:38.051359 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 03 14:45:38 crc kubenswrapper[4774]: I1003 14:45:38.155535 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/995e7b80-23e7-4f81-95b3-a3bc35a125ae-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"995e7b80-23e7-4f81-95b3-a3bc35a125ae\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 14:45:38 crc 
kubenswrapper[4774]: I1003 14:45:38.155611 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/995e7b80-23e7-4f81-95b3-a3bc35a125ae-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"995e7b80-23e7-4f81-95b3-a3bc35a125ae\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 14:45:38 crc kubenswrapper[4774]: I1003 14:45:38.257327 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/995e7b80-23e7-4f81-95b3-a3bc35a125ae-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"995e7b80-23e7-4f81-95b3-a3bc35a125ae\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 14:45:38 crc kubenswrapper[4774]: I1003 14:45:38.257430 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/995e7b80-23e7-4f81-95b3-a3bc35a125ae-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"995e7b80-23e7-4f81-95b3-a3bc35a125ae\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 14:45:38 crc kubenswrapper[4774]: I1003 14:45:38.257511 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/995e7b80-23e7-4f81-95b3-a3bc35a125ae-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"995e7b80-23e7-4f81-95b3-a3bc35a125ae\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 14:45:38 crc kubenswrapper[4774]: I1003 14:45:38.283178 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/995e7b80-23e7-4f81-95b3-a3bc35a125ae-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"995e7b80-23e7-4f81-95b3-a3bc35a125ae\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 14:45:38 crc kubenswrapper[4774]: I1003 14:45:38.367476 4774 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 14:45:38 crc kubenswrapper[4774]: I1003 14:45:38.547796 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" event={"ID":"7d110133-f0c7-4b4c-a060-5a0fdb950e9f","Type":"ContainerStarted","Data":"f1f700616a1ad94d60892ac87f217e84ce5a9812c3fcc3832fd740a1c6f1e7f2"} Oct 03 14:45:38 crc kubenswrapper[4774]: I1003 14:45:38.548157 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" event={"ID":"7d110133-f0c7-4b4c-a060-5a0fdb950e9f","Type":"ContainerStarted","Data":"ca2de3af8dd4f1cc3648cdebc623ca2347d2e7f6c54584aec12e2a45da3a95d3"} Oct 03 14:45:38 crc kubenswrapper[4774]: I1003 14:45:38.549064 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:38 crc kubenswrapper[4774]: I1003 14:45:38.553788 4774 generic.go:334] "Generic (PLEG): container finished" podID="951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9" containerID="1a02b436e27ea94f0d0c69bf07983b638385eb7dc87c0284146192a5e0740503" exitCode=0 Oct 03 14:45:38 crc kubenswrapper[4774]: I1003 14:45:38.553880 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-8kfbc" event={"ID":"951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9","Type":"ContainerDied","Data":"1a02b436e27ea94f0d0c69bf07983b638385eb7dc87c0284146192a5e0740503"} Oct 03 14:45:38 crc kubenswrapper[4774]: I1003 14:45:38.563320 4774 generic.go:334] "Generic (PLEG): container finished" podID="ea488297-1ed1-475d-8c6a-9b4881dd5583" containerID="7fd2445ccc390b5e5c51155b0065619344aca66ba63722d5a832408e1d5836da" exitCode=0 Oct 03 14:45:38 crc kubenswrapper[4774]: I1003 14:45:38.563573 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kj6jv" 
event={"ID":"ea488297-1ed1-475d-8c6a-9b4881dd5583","Type":"ContainerDied","Data":"7fd2445ccc390b5e5c51155b0065619344aca66ba63722d5a832408e1d5836da"} Oct 03 14:45:38 crc kubenswrapper[4774]: I1003 14:45:38.563601 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kj6jv" event={"ID":"ea488297-1ed1-475d-8c6a-9b4881dd5583","Type":"ContainerStarted","Data":"6fd5f0eafe0fce905e2e0341755601d4ca50392545238b39514f13c26705307b"} Oct 03 14:45:38 crc kubenswrapper[4774]: I1003 14:45:38.574291 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" podStartSLOduration=133.574271848 podStartE2EDuration="2m13.574271848s" podCreationTimestamp="2025-10-03 14:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:38.566588717 +0000 UTC m=+161.155792189" watchObservedRunningTime="2025-10-03 14:45:38.574271848 +0000 UTC m=+161.163475300" Oct 03 14:45:38 crc kubenswrapper[4774]: I1003 14:45:38.574876 4774 generic.go:334] "Generic (PLEG): container finished" podID="8dfe8164-582f-46df-a0a7-790c7df9e6f9" containerID="addc74e6136e493c31d72f35321f895663600c970ed40e4fdee28cb72976f528" exitCode=0 Oct 03 14:45:38 crc kubenswrapper[4774]: I1003 14:45:38.574948 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6szd8" event={"ID":"8dfe8164-582f-46df-a0a7-790c7df9e6f9","Type":"ContainerDied","Data":"addc74e6136e493c31d72f35321f895663600c970ed40e4fdee28cb72976f528"} Oct 03 14:45:38 crc kubenswrapper[4774]: I1003 14:45:38.583778 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"907dd334-f95b-417d-bbd7-402486f2fcff","Type":"ContainerStarted","Data":"2f10136283806a7ac7fe05cfdfa9d49a5e3967aeaec6242ec75cdc8bc927c3ce"} Oct 03 14:45:38 crc 
kubenswrapper[4774]: I1003 14:45:38.590989 4774 generic.go:334] "Generic (PLEG): container finished" podID="2aad74fd-c975-42c4-8b05-ed28dbd55205" containerID="241e7b66b12fb0ab7a133a4e3394b1e88df7231028e415a59af033fe842d0eae" exitCode=0 Oct 03 14:45:38 crc kubenswrapper[4774]: I1003 14:45:38.591860 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mg7ls" event={"ID":"2aad74fd-c975-42c4-8b05-ed28dbd55205","Type":"ContainerDied","Data":"241e7b66b12fb0ab7a133a4e3394b1e88df7231028e415a59af033fe842d0eae"} Oct 03 14:45:38 crc kubenswrapper[4774]: I1003 14:45:38.591889 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mg7ls" event={"ID":"2aad74fd-c975-42c4-8b05-ed28dbd55205","Type":"ContainerStarted","Data":"3a1f1daa166150bf27f3602be244016c227bc44d51c375a5ba04c569a732e701"} Oct 03 14:45:38 crc kubenswrapper[4774]: I1003 14:45:38.598238 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-kzdbd" Oct 03 14:45:38 crc kubenswrapper[4774]: I1003 14:45:38.652526 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.652508265 podStartE2EDuration="2.652508265s" podCreationTimestamp="2025-10-03 14:45:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:45:38.640337864 +0000 UTC m=+161.229541316" watchObservedRunningTime="2025-10-03 14:45:38.652508265 +0000 UTC m=+161.241711717" Oct 03 14:45:38 crc kubenswrapper[4774]: I1003 14:45:38.658413 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-s946h" Oct 03 14:45:38 crc kubenswrapper[4774]: I1003 14:45:38.662339 4774 patch_prober.go:28] interesting pod/router-default-5444994796-s946h container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 14:45:38 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Oct 03 14:45:38 crc kubenswrapper[4774]: [+]process-running ok Oct 03 14:45:38 crc kubenswrapper[4774]: healthz check failed Oct 03 14:45:38 crc kubenswrapper[4774]: I1003 14:45:38.662392 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s946h" podUID="67a98087-a2b3-457a-be93-8fd1203b825d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 14:45:38 crc kubenswrapper[4774]: I1003 14:45:38.726887 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-wq2sd" Oct 03 14:45:38 crc kubenswrapper[4774]: I1003 14:45:38.935175 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 03 14:45:39 crc kubenswrapper[4774]: I1003 14:45:39.597963 4774 generic.go:334] "Generic (PLEG): container finished" podID="907dd334-f95b-417d-bbd7-402486f2fcff" containerID="2f10136283806a7ac7fe05cfdfa9d49a5e3967aeaec6242ec75cdc8bc927c3ce" exitCode=0 Oct 03 14:45:39 crc kubenswrapper[4774]: I1003 14:45:39.598071 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"907dd334-f95b-417d-bbd7-402486f2fcff","Type":"ContainerDied","Data":"2f10136283806a7ac7fe05cfdfa9d49a5e3967aeaec6242ec75cdc8bc927c3ce"} Oct 03 14:45:39 crc kubenswrapper[4774]: I1003 14:45:39.600076 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"995e7b80-23e7-4f81-95b3-a3bc35a125ae","Type":"ContainerStarted","Data":"a4c3382954fd1a72b5f9bd113491f3deee07e43afdcf67c78c560e54bf4f4c8a"} Oct 03 14:45:39 crc kubenswrapper[4774]: 
I1003 14:45:39.660730 4774 patch_prober.go:28] interesting pod/router-default-5444994796-s946h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 14:45:39 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Oct 03 14:45:39 crc kubenswrapper[4774]: [+]process-running ok Oct 03 14:45:39 crc kubenswrapper[4774]: healthz check failed Oct 03 14:45:39 crc kubenswrapper[4774]: I1003 14:45:39.661004 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s946h" podUID="67a98087-a2b3-457a-be93-8fd1203b825d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 14:45:39 crc kubenswrapper[4774]: I1003 14:45:39.952011 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-8kfbc" Oct 03 14:45:40 crc kubenswrapper[4774]: I1003 14:45:40.101743 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9-config-volume\") pod \"951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9\" (UID: \"951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9\") " Oct 03 14:45:40 crc kubenswrapper[4774]: I1003 14:45:40.101891 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhpsm\" (UniqueName: \"kubernetes.io/projected/951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9-kube-api-access-jhpsm\") pod \"951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9\" (UID: \"951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9\") " Oct 03 14:45:40 crc kubenswrapper[4774]: I1003 14:45:40.101968 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9-secret-volume\") pod 
\"951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9\" (UID: \"951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9\") " Oct 03 14:45:40 crc kubenswrapper[4774]: I1003 14:45:40.103135 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9-config-volume" (OuterVolumeSpecName: "config-volume") pod "951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9" (UID: "951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:45:40 crc kubenswrapper[4774]: I1003 14:45:40.108915 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9-kube-api-access-jhpsm" (OuterVolumeSpecName: "kube-api-access-jhpsm") pod "951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9" (UID: "951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9"). InnerVolumeSpecName "kube-api-access-jhpsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:45:40 crc kubenswrapper[4774]: I1003 14:45:40.125364 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9" (UID: "951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:45:40 crc kubenswrapper[4774]: I1003 14:45:40.204279 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhpsm\" (UniqueName: \"kubernetes.io/projected/951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9-kube-api-access-jhpsm\") on node \"crc\" DevicePath \"\"" Oct 03 14:45:40 crc kubenswrapper[4774]: I1003 14:45:40.204316 4774 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 14:45:40 crc kubenswrapper[4774]: I1003 14:45:40.204327 4774 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 14:45:40 crc kubenswrapper[4774]: I1003 14:45:40.627127 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-8kfbc" event={"ID":"951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9","Type":"ContainerDied","Data":"faa04790f5b4b84fe7a6c0618a0ea7f35d3e6ea0cbc5c77d7d346d3f3436b3ed"} Oct 03 14:45:40 crc kubenswrapper[4774]: I1003 14:45:40.627168 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="faa04790f5b4b84fe7a6c0618a0ea7f35d3e6ea0cbc5c77d7d346d3f3436b3ed" Oct 03 14:45:40 crc kubenswrapper[4774]: I1003 14:45:40.627173 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-8kfbc" Oct 03 14:45:40 crc kubenswrapper[4774]: I1003 14:45:40.635233 4774 generic.go:334] "Generic (PLEG): container finished" podID="995e7b80-23e7-4f81-95b3-a3bc35a125ae" containerID="8a72c641d72d4617e931e403b65270f2e241e31d8b58e6358b3ef96387846816" exitCode=0 Oct 03 14:45:40 crc kubenswrapper[4774]: I1003 14:45:40.635286 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"995e7b80-23e7-4f81-95b3-a3bc35a125ae","Type":"ContainerDied","Data":"8a72c641d72d4617e931e403b65270f2e241e31d8b58e6358b3ef96387846816"} Oct 03 14:45:40 crc kubenswrapper[4774]: I1003 14:45:40.661910 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-s946h" Oct 03 14:45:40 crc kubenswrapper[4774]: I1003 14:45:40.665053 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-s946h" Oct 03 14:45:40 crc kubenswrapper[4774]: I1003 14:45:40.957332 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 14:45:41 crc kubenswrapper[4774]: I1003 14:45:41.122071 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/907dd334-f95b-417d-bbd7-402486f2fcff-kube-api-access\") pod \"907dd334-f95b-417d-bbd7-402486f2fcff\" (UID: \"907dd334-f95b-417d-bbd7-402486f2fcff\") " Oct 03 14:45:41 crc kubenswrapper[4774]: I1003 14:45:41.122187 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/907dd334-f95b-417d-bbd7-402486f2fcff-kubelet-dir\") pod \"907dd334-f95b-417d-bbd7-402486f2fcff\" (UID: \"907dd334-f95b-417d-bbd7-402486f2fcff\") " Oct 03 14:45:41 crc kubenswrapper[4774]: I1003 14:45:41.122279 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/907dd334-f95b-417d-bbd7-402486f2fcff-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "907dd334-f95b-417d-bbd7-402486f2fcff" (UID: "907dd334-f95b-417d-bbd7-402486f2fcff"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:45:41 crc kubenswrapper[4774]: I1003 14:45:41.122804 4774 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/907dd334-f95b-417d-bbd7-402486f2fcff-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 03 14:45:41 crc kubenswrapper[4774]: I1003 14:45:41.127270 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/907dd334-f95b-417d-bbd7-402486f2fcff-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "907dd334-f95b-417d-bbd7-402486f2fcff" (UID: "907dd334-f95b-417d-bbd7-402486f2fcff"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:45:41 crc kubenswrapper[4774]: I1003 14:45:41.226035 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/907dd334-f95b-417d-bbd7-402486f2fcff-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 14:45:41 crc kubenswrapper[4774]: I1003 14:45:41.644868 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"907dd334-f95b-417d-bbd7-402486f2fcff","Type":"ContainerDied","Data":"acd5ee968b483f9fdf0ec76f30726f5b39f784ec9a160365b3f8be33d7a910a1"} Oct 03 14:45:41 crc kubenswrapper[4774]: I1003 14:45:41.645195 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acd5ee968b483f9fdf0ec76f30726f5b39f784ec9a160365b3f8be33d7a910a1" Oct 03 14:45:41 crc kubenswrapper[4774]: I1003 14:45:41.644887 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 14:45:43 crc kubenswrapper[4774]: I1003 14:45:43.758027 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-8pqx5" Oct 03 14:45:47 crc kubenswrapper[4774]: I1003 14:45:47.308682 4774 patch_prober.go:28] interesting pod/downloads-7954f5f757-jh8hv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body= Oct 03 14:45:47 crc kubenswrapper[4774]: I1003 14:45:47.308983 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jh8hv" podUID="40ce0e8a-b3ee-4b5a-a3ef-ee0e4a573496" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.34:8080/\": dial tcp 10.217.0.34:8080: connect: connection refused" Oct 03 14:45:47 crc kubenswrapper[4774]: 
I1003 14:45:47.308700 4774 patch_prober.go:28] interesting pod/downloads-7954f5f757-jh8hv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.34:8080/\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body= Oct 03 14:45:47 crc kubenswrapper[4774]: I1003 14:45:47.309394 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-jh8hv" podUID="40ce0e8a-b3ee-4b5a-a3ef-ee0e4a573496" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.34:8080/\": dial tcp 10.217.0.34:8080: connect: connection refused" Oct 03 14:45:47 crc kubenswrapper[4774]: I1003 14:45:47.935971 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-vnvw7" Oct 03 14:45:47 crc kubenswrapper[4774]: I1003 14:45:47.940151 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-vnvw7" Oct 03 14:45:48 crc kubenswrapper[4774]: I1003 14:45:48.146497 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88d3c89f-9fbd-4d50-840a-c5c78528c903-metrics-certs\") pod \"network-metrics-daemon-ghf5t\" (UID: \"88d3c89f-9fbd-4d50-840a-c5c78528c903\") " pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:45:48 crc kubenswrapper[4774]: I1003 14:45:48.152485 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88d3c89f-9fbd-4d50-840a-c5c78528c903-metrics-certs\") pod \"network-metrics-daemon-ghf5t\" (UID: \"88d3c89f-9fbd-4d50-840a-c5c78528c903\") " pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:45:48 crc kubenswrapper[4774]: I1003 14:45:48.316752 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ghf5t" Oct 03 14:45:48 crc kubenswrapper[4774]: I1003 14:45:48.846882 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 14:45:48 crc kubenswrapper[4774]: I1003 14:45:48.853574 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/995e7b80-23e7-4f81-95b3-a3bc35a125ae-kubelet-dir\") pod \"995e7b80-23e7-4f81-95b3-a3bc35a125ae\" (UID: \"995e7b80-23e7-4f81-95b3-a3bc35a125ae\") " Oct 03 14:45:48 crc kubenswrapper[4774]: I1003 14:45:48.853667 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/995e7b80-23e7-4f81-95b3-a3bc35a125ae-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "995e7b80-23e7-4f81-95b3-a3bc35a125ae" (UID: "995e7b80-23e7-4f81-95b3-a3bc35a125ae"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:45:48 crc kubenswrapper[4774]: I1003 14:45:48.854104 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/995e7b80-23e7-4f81-95b3-a3bc35a125ae-kube-api-access\") pod \"995e7b80-23e7-4f81-95b3-a3bc35a125ae\" (UID: \"995e7b80-23e7-4f81-95b3-a3bc35a125ae\") " Oct 03 14:45:48 crc kubenswrapper[4774]: I1003 14:45:48.854455 4774 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/995e7b80-23e7-4f81-95b3-a3bc35a125ae-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 03 14:45:48 crc kubenswrapper[4774]: I1003 14:45:48.866653 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/995e7b80-23e7-4f81-95b3-a3bc35a125ae-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "995e7b80-23e7-4f81-95b3-a3bc35a125ae" (UID: 
"995e7b80-23e7-4f81-95b3-a3bc35a125ae"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:45:48 crc kubenswrapper[4774]: I1003 14:45:48.956295 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/995e7b80-23e7-4f81-95b3-a3bc35a125ae-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 14:45:49 crc kubenswrapper[4774]: I1003 14:45:49.710595 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"995e7b80-23e7-4f81-95b3-a3bc35a125ae","Type":"ContainerDied","Data":"a4c3382954fd1a72b5f9bd113491f3deee07e43afdcf67c78c560e54bf4f4c8a"} Oct 03 14:45:49 crc kubenswrapper[4774]: I1003 14:45:49.710869 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4c3382954fd1a72b5f9bd113491f3deee07e43afdcf67c78c560e54bf4f4c8a" Oct 03 14:45:49 crc kubenswrapper[4774]: I1003 14:45:49.710706 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 14:45:50 crc kubenswrapper[4774]: I1003 14:45:50.654026 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:45:50 crc kubenswrapper[4774]: I1003 14:45:50.654103 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:45:57 crc kubenswrapper[4774]: I1003 14:45:57.252280 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:45:57 crc kubenswrapper[4774]: I1003 14:45:57.319792 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-jh8hv" Oct 03 14:46:02 crc kubenswrapper[4774]: I1003 14:46:02.703087 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ghf5t"] Oct 03 14:46:08 crc kubenswrapper[4774]: I1003 14:46:08.657817 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jsbrr" Oct 03 14:46:09 crc kubenswrapper[4774]: E1003 14:46:09.603963 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 03 14:46:09 crc kubenswrapper[4774]: E1003 14:46:09.604133 4774 kuberuntime_manager.go:1274] 
"Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dswf2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-lvvp2_openshift-marketplace(e4a6d0e1-38db-42e1-8e29-b52de5bbad1d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 14:46:09 crc kubenswrapper[4774]: E1003 14:46:09.605364 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-lvvp2" podUID="e4a6d0e1-38db-42e1-8e29-b52de5bbad1d" Oct 03 14:46:10 crc kubenswrapper[4774]: I1003 14:46:10.569584 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:46:14 crc kubenswrapper[4774]: E1003 14:46:14.434729 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 03 14:46:14 crc kubenswrapper[4774]: E1003 14:46:14.434906 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ndlls,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-2bx5m_openshift-marketplace(062191e1-9f34-4dba-bd1c-9bffe53f5cfd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 14:46:14 crc kubenswrapper[4774]: E1003 14:46:14.436421 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-2bx5m" podUID="062191e1-9f34-4dba-bd1c-9bffe53f5cfd" Oct 03 14:46:15 crc 
kubenswrapper[4774]: E1003 14:46:15.484664 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 03 14:46:15 crc kubenswrapper[4774]: E1003 14:46:15.484825 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zkmzf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-6p5qb_openshift-marketplace(082e67c6-a87d-48ea-90e3-e20178613597): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 14:46:15 crc kubenswrapper[4774]: E1003 14:46:15.486246 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-6p5qb" podUID="082e67c6-a87d-48ea-90e3-e20178613597" Oct 03 14:46:20 crc kubenswrapper[4774]: I1003 14:46:20.653596 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:46:20 crc kubenswrapper[4774]: I1003 14:46:20.654534 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:46:24 crc kubenswrapper[4774]: W1003 14:46:24.366559 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88d3c89f_9fbd_4d50_840a_c5c78528c903.slice/crio-7b5985c16655df83d1ea6a23c6fe54928cba3487f95ccb2fb40b813d4cd4a7ef WatchSource:0}: Error finding container 7b5985c16655df83d1ea6a23c6fe54928cba3487f95ccb2fb40b813d4cd4a7ef: Status 404 returned error can't find the container with id 7b5985c16655df83d1ea6a23c6fe54928cba3487f95ccb2fb40b813d4cd4a7ef Oct 03 14:46:24 crc kubenswrapper[4774]: E1003 14:46:24.366667 4774 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-6p5qb" podUID="082e67c6-a87d-48ea-90e3-e20178613597" Oct 03 14:46:24 crc kubenswrapper[4774]: E1003 14:46:24.366809 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-2bx5m" podUID="062191e1-9f34-4dba-bd1c-9bffe53f5cfd" Oct 03 14:46:24 crc kubenswrapper[4774]: E1003 14:46:24.450533 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 03 14:46:24 crc kubenswrapper[4774]: E1003 14:46:24.450681 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xxtfx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-kj6jv_openshift-marketplace(ea488297-1ed1-475d-8c6a-9b4881dd5583): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 14:46:24 crc kubenswrapper[4774]: E1003 14:46:24.451857 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-kj6jv" podUID="ea488297-1ed1-475d-8c6a-9b4881dd5583" Oct 03 14:46:24 crc 
kubenswrapper[4774]: E1003 14:46:24.460590 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 03 14:46:24 crc kubenswrapper[4774]: E1003 14:46:24.460716 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cwp6h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-mg7ls_openshift-marketplace(2aad74fd-c975-42c4-8b05-ed28dbd55205): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 14:46:24 crc kubenswrapper[4774]: E1003 14:46:24.461867 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-mg7ls" podUID="2aad74fd-c975-42c4-8b05-ed28dbd55205" Oct 03 14:46:24 crc kubenswrapper[4774]: E1003 14:46:24.691979 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 03 14:46:24 crc kubenswrapper[4774]: E1003 14:46:24.692438 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-78rsr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-wbncg_openshift-marketplace(8656ee03-e27c-4ba6-a803-ab372ecb9b7b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 14:46:24 crc kubenswrapper[4774]: E1003 14:46:24.693702 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-wbncg" podUID="8656ee03-e27c-4ba6-a803-ab372ecb9b7b" Oct 03 14:46:24 crc 
kubenswrapper[4774]: I1003 14:46:24.892677 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ghf5t" event={"ID":"88d3c89f-9fbd-4d50-840a-c5c78528c903","Type":"ContainerStarted","Data":"7b5985c16655df83d1ea6a23c6fe54928cba3487f95ccb2fb40b813d4cd4a7ef"} Oct 03 14:46:25 crc kubenswrapper[4774]: E1003 14:46:25.226287 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-kj6jv" podUID="ea488297-1ed1-475d-8c6a-9b4881dd5583" Oct 03 14:46:25 crc kubenswrapper[4774]: E1003 14:46:25.226314 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-mg7ls" podUID="2aad74fd-c975-42c4-8b05-ed28dbd55205" Oct 03 14:46:25 crc kubenswrapper[4774]: E1003 14:46:25.226302 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-wbncg" podUID="8656ee03-e27c-4ba6-a803-ab372ecb9b7b" Oct 03 14:46:25 crc kubenswrapper[4774]: E1003 14:46:25.356924 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 03 14:46:25 crc kubenswrapper[4774]: E1003 14:46:25.357132 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wchl4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-6szd8_openshift-marketplace(8dfe8164-582f-46df-a0a7-790c7df9e6f9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 14:46:25 crc kubenswrapper[4774]: E1003 14:46:25.358284 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code 
= Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-6szd8" podUID="8dfe8164-582f-46df-a0a7-790c7df9e6f9" Oct 03 14:46:25 crc kubenswrapper[4774]: E1003 14:46:25.382047 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 03 14:46:25 crc kubenswrapper[4774]: E1003 14:46:25.382219 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9j9pm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSourc
e{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-hntbw_openshift-marketplace(8922cd85-deea-4326-88fc-68c67debf56c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 14:46:25 crc kubenswrapper[4774]: E1003 14:46:25.384400 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-hntbw" podUID="8922cd85-deea-4326-88fc-68c67debf56c" Oct 03 14:46:25 crc kubenswrapper[4774]: I1003 14:46:25.898677 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lvvp2" event={"ID":"e4a6d0e1-38db-42e1-8e29-b52de5bbad1d","Type":"ContainerStarted","Data":"0dce622d016960e7f0e1296173f249ef8cc2d5599efc47aae75a04f9f80fe587"} Oct 03 14:46:25 crc kubenswrapper[4774]: I1003 14:46:25.902765 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ghf5t" event={"ID":"88d3c89f-9fbd-4d50-840a-c5c78528c903","Type":"ContainerStarted","Data":"a27914b881b114b5534cdee56737e01aa8bb20bb454885f6bececc46f3a77858"} Oct 03 14:46:25 crc kubenswrapper[4774]: I1003 14:46:25.902832 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ghf5t" event={"ID":"88d3c89f-9fbd-4d50-840a-c5c78528c903","Type":"ContainerStarted","Data":"796976ab9a298da04c1ad9397e5525a9f6739b35728792bfef2fd5515f313568"} Oct 03 14:46:25 crc kubenswrapper[4774]: E1003 14:46:25.904075 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off 
pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-hntbw" podUID="8922cd85-deea-4326-88fc-68c67debf56c" Oct 03 14:46:25 crc kubenswrapper[4774]: E1003 14:46:25.904580 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-6szd8" podUID="8dfe8164-582f-46df-a0a7-790c7df9e6f9" Oct 03 14:46:25 crc kubenswrapper[4774]: I1003 14:46:25.974202 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-ghf5t" podStartSLOduration=180.974182995 podStartE2EDuration="3m0.974182995s" podCreationTimestamp="2025-10-03 14:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:46:25.97401699 +0000 UTC m=+208.563220442" watchObservedRunningTime="2025-10-03 14:46:25.974182995 +0000 UTC m=+208.563386467" Oct 03 14:46:26 crc kubenswrapper[4774]: I1003 14:46:26.911888 4774 generic.go:334] "Generic (PLEG): container finished" podID="e4a6d0e1-38db-42e1-8e29-b52de5bbad1d" containerID="0dce622d016960e7f0e1296173f249ef8cc2d5599efc47aae75a04f9f80fe587" exitCode=0 Oct 03 14:46:26 crc kubenswrapper[4774]: I1003 14:46:26.911973 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lvvp2" event={"ID":"e4a6d0e1-38db-42e1-8e29-b52de5bbad1d","Type":"ContainerDied","Data":"0dce622d016960e7f0e1296173f249ef8cc2d5599efc47aae75a04f9f80fe587"} Oct 03 14:46:27 crc kubenswrapper[4774]: I1003 14:46:27.919926 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lvvp2" 
event={"ID":"e4a6d0e1-38db-42e1-8e29-b52de5bbad1d","Type":"ContainerStarted","Data":"10d22322ea6701471ae96fa67822eef255ecd138cab3a81c70d5dfdce2e58151"} Oct 03 14:46:27 crc kubenswrapper[4774]: I1003 14:46:27.946229 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lvvp2" podStartSLOduration=4.11594478 podStartE2EDuration="54.946204592s" podCreationTimestamp="2025-10-03 14:45:33 +0000 UTC" firstStartedPulling="2025-10-03 14:45:36.496934758 +0000 UTC m=+159.086138210" lastFinishedPulling="2025-10-03 14:46:27.32719456 +0000 UTC m=+209.916398022" observedRunningTime="2025-10-03 14:46:27.944208569 +0000 UTC m=+210.533412081" watchObservedRunningTime="2025-10-03 14:46:27.946204592 +0000 UTC m=+210.535408084" Oct 03 14:46:34 crc kubenswrapper[4774]: I1003 14:46:34.486847 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lvvp2" Oct 03 14:46:34 crc kubenswrapper[4774]: I1003 14:46:34.487444 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lvvp2" Oct 03 14:46:34 crc kubenswrapper[4774]: I1003 14:46:34.866554 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lvvp2" Oct 03 14:46:34 crc kubenswrapper[4774]: I1003 14:46:34.995604 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lvvp2" Oct 03 14:46:39 crc kubenswrapper[4774]: I1003 14:46:39.986414 4774 generic.go:334] "Generic (PLEG): container finished" podID="ea488297-1ed1-475d-8c6a-9b4881dd5583" containerID="7511540248eed3cf2f2e6a6a37158a3f16d34542b7203bcbb7697009833bd9af" exitCode=0 Oct 03 14:46:39 crc kubenswrapper[4774]: I1003 14:46:39.986484 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kj6jv" 
event={"ID":"ea488297-1ed1-475d-8c6a-9b4881dd5583","Type":"ContainerDied","Data":"7511540248eed3cf2f2e6a6a37158a3f16d34542b7203bcbb7697009833bd9af"} Oct 03 14:46:39 crc kubenswrapper[4774]: I1003 14:46:39.990031 4774 generic.go:334] "Generic (PLEG): container finished" podID="082e67c6-a87d-48ea-90e3-e20178613597" containerID="cff9293a3a7cdf5e4e770c8ed9ba2357c6ceea33672a623489c8127ebf6a9871" exitCode=0 Oct 03 14:46:39 crc kubenswrapper[4774]: I1003 14:46:39.990102 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6p5qb" event={"ID":"082e67c6-a87d-48ea-90e3-e20178613597","Type":"ContainerDied","Data":"cff9293a3a7cdf5e4e770c8ed9ba2357c6ceea33672a623489c8127ebf6a9871"} Oct 03 14:46:39 crc kubenswrapper[4774]: I1003 14:46:39.992821 4774 generic.go:334] "Generic (PLEG): container finished" podID="8dfe8164-582f-46df-a0a7-790c7df9e6f9" containerID="02bad132f351c10a8594fd5edcc4aa03929f11b5f25aa8c578aab9663bf4c641" exitCode=0 Oct 03 14:46:39 crc kubenswrapper[4774]: I1003 14:46:39.992848 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6szd8" event={"ID":"8dfe8164-582f-46df-a0a7-790c7df9e6f9","Type":"ContainerDied","Data":"02bad132f351c10a8594fd5edcc4aa03929f11b5f25aa8c578aab9663bf4c641"} Oct 03 14:46:41 crc kubenswrapper[4774]: I1003 14:46:41.002094 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kj6jv" event={"ID":"ea488297-1ed1-475d-8c6a-9b4881dd5583","Type":"ContainerStarted","Data":"57bcafe6177ef6fbedb031d4be4233c22bdc57700beb1ac7e4e1f4a2a3ace33f"} Oct 03 14:46:41 crc kubenswrapper[4774]: I1003 14:46:41.005542 4774 generic.go:334] "Generic (PLEG): container finished" podID="062191e1-9f34-4dba-bd1c-9bffe53f5cfd" containerID="6e6f5e7b98a3f52922cb4a4e351aae146118381748560093cc1ff42558d528b3" exitCode=0 Oct 03 14:46:41 crc kubenswrapper[4774]: I1003 14:46:41.005630 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-2bx5m" event={"ID":"062191e1-9f34-4dba-bd1c-9bffe53f5cfd","Type":"ContainerDied","Data":"6e6f5e7b98a3f52922cb4a4e351aae146118381748560093cc1ff42558d528b3"} Oct 03 14:46:41 crc kubenswrapper[4774]: I1003 14:46:41.007560 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6p5qb" event={"ID":"082e67c6-a87d-48ea-90e3-e20178613597","Type":"ContainerStarted","Data":"43ae7255341a7c9b51d6860fc814a0483912cf72040233bc5fcf086250c90693"} Oct 03 14:46:41 crc kubenswrapper[4774]: I1003 14:46:41.010060 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6szd8" event={"ID":"8dfe8164-582f-46df-a0a7-790c7df9e6f9","Type":"ContainerStarted","Data":"50e1b199f8c5ffec1c65e8fc4c5d64a06221b40d733818a08a69ca18806a88b0"} Oct 03 14:46:41 crc kubenswrapper[4774]: I1003 14:46:41.029149 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kj6jv" podStartSLOduration=2.212486164 podStartE2EDuration="1m4.029130456s" podCreationTimestamp="2025-10-03 14:45:37 +0000 UTC" firstStartedPulling="2025-10-03 14:45:38.580384969 +0000 UTC m=+161.169588421" lastFinishedPulling="2025-10-03 14:46:40.397029261 +0000 UTC m=+222.986232713" observedRunningTime="2025-10-03 14:46:41.027705241 +0000 UTC m=+223.616908703" watchObservedRunningTime="2025-10-03 14:46:41.029130456 +0000 UTC m=+223.618333908" Oct 03 14:46:41 crc kubenswrapper[4774]: I1003 14:46:41.071214 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6szd8" podStartSLOduration=3.103983445 podStartE2EDuration="1m5.071194154s" podCreationTimestamp="2025-10-03 14:45:36 +0000 UTC" firstStartedPulling="2025-10-03 14:45:38.585019584 +0000 UTC m=+161.174223026" lastFinishedPulling="2025-10-03 14:46:40.552230283 +0000 UTC m=+223.141433735" observedRunningTime="2025-10-03 14:46:41.064805792 
+0000 UTC m=+223.654009244" watchObservedRunningTime="2025-10-03 14:46:41.071194154 +0000 UTC m=+223.660397606" Oct 03 14:46:41 crc kubenswrapper[4774]: I1003 14:46:41.083341 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6p5qb" podStartSLOduration=3.138703069 podStartE2EDuration="1m7.083321587s" podCreationTimestamp="2025-10-03 14:45:34 +0000 UTC" firstStartedPulling="2025-10-03 14:45:36.49155775 +0000 UTC m=+159.080761202" lastFinishedPulling="2025-10-03 14:46:40.436176268 +0000 UTC m=+223.025379720" observedRunningTime="2025-10-03 14:46:41.08023294 +0000 UTC m=+223.669436402" watchObservedRunningTime="2025-10-03 14:46:41.083321587 +0000 UTC m=+223.672525029" Oct 03 14:46:41 crc kubenswrapper[4774]: E1003 14:46:41.397840 4774 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8656ee03_e27c_4ba6_a803_ab372ecb9b7b.slice/crio-conmon-22b02c966c08ad3fa17ffa98ea3eec18da00a1616ca69d7d3d851e04e702ca19.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8656ee03_e27c_4ba6_a803_ab372ecb9b7b.slice/crio-22b02c966c08ad3fa17ffa98ea3eec18da00a1616ca69d7d3d851e04e702ca19.scope\": RecentStats: unable to find data in memory cache]" Oct 03 14:46:42 crc kubenswrapper[4774]: I1003 14:46:42.015842 4774 generic.go:334] "Generic (PLEG): container finished" podID="8656ee03-e27c-4ba6-a803-ab372ecb9b7b" containerID="22b02c966c08ad3fa17ffa98ea3eec18da00a1616ca69d7d3d851e04e702ca19" exitCode=0 Oct 03 14:46:42 crc kubenswrapper[4774]: I1003 14:46:42.015927 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbncg" event={"ID":"8656ee03-e27c-4ba6-a803-ab372ecb9b7b","Type":"ContainerDied","Data":"22b02c966c08ad3fa17ffa98ea3eec18da00a1616ca69d7d3d851e04e702ca19"} Oct 03 14:46:42 crc 
kubenswrapper[4774]: I1003 14:46:42.020300 4774 generic.go:334] "Generic (PLEG): container finished" podID="8922cd85-deea-4326-88fc-68c67debf56c" containerID="0d64e788760602d93c85ee1d770d03e0f304106d4749502c1b9a6e5a1936dfd6" exitCode=0 Oct 03 14:46:42 crc kubenswrapper[4774]: I1003 14:46:42.020404 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hntbw" event={"ID":"8922cd85-deea-4326-88fc-68c67debf56c","Type":"ContainerDied","Data":"0d64e788760602d93c85ee1d770d03e0f304106d4749502c1b9a6e5a1936dfd6"} Oct 03 14:46:42 crc kubenswrapper[4774]: I1003 14:46:42.022504 4774 generic.go:334] "Generic (PLEG): container finished" podID="2aad74fd-c975-42c4-8b05-ed28dbd55205" containerID="b223d514ad97cb3e3ca217b60e774a83b4505e236401d379396d7f9cbe15237c" exitCode=0 Oct 03 14:46:42 crc kubenswrapper[4774]: I1003 14:46:42.022594 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mg7ls" event={"ID":"2aad74fd-c975-42c4-8b05-ed28dbd55205","Type":"ContainerDied","Data":"b223d514ad97cb3e3ca217b60e774a83b4505e236401d379396d7f9cbe15237c"} Oct 03 14:46:42 crc kubenswrapper[4774]: I1003 14:46:42.034786 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2bx5m" event={"ID":"062191e1-9f34-4dba-bd1c-9bffe53f5cfd","Type":"ContainerStarted","Data":"1cc64d06fa501089dae4c94b6c5806c657e04d76020fda78779165c881d12b52"} Oct 03 14:46:42 crc kubenswrapper[4774]: I1003 14:46:42.089639 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2bx5m" podStartSLOduration=3.071624743 podStartE2EDuration="1m8.089599992s" podCreationTimestamp="2025-10-03 14:45:34 +0000 UTC" firstStartedPulling="2025-10-03 14:45:36.488422322 +0000 UTC m=+159.077625774" lastFinishedPulling="2025-10-03 14:46:41.506397571 +0000 UTC m=+224.095601023" observedRunningTime="2025-10-03 14:46:42.06262576 +0000 UTC 
m=+224.651829212" watchObservedRunningTime="2025-10-03 14:46:42.089599992 +0000 UTC m=+224.678803444" Oct 03 14:46:43 crc kubenswrapper[4774]: I1003 14:46:43.044399 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mg7ls" event={"ID":"2aad74fd-c975-42c4-8b05-ed28dbd55205","Type":"ContainerStarted","Data":"19ce9cd779ae623063419f702b4eac661c79d2dc10203bfa62c52c8e2683272f"} Oct 03 14:46:43 crc kubenswrapper[4774]: I1003 14:46:43.046413 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbncg" event={"ID":"8656ee03-e27c-4ba6-a803-ab372ecb9b7b","Type":"ContainerStarted","Data":"0e030a7effaae6d6ae265a1cefddc9dfa03e397c913f1ed3b6480b89dcaeafb0"} Oct 03 14:46:43 crc kubenswrapper[4774]: I1003 14:46:43.047949 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hntbw" event={"ID":"8922cd85-deea-4326-88fc-68c67debf56c","Type":"ContainerStarted","Data":"6b79066e99b1763fd752878b0f5e706ee85a60e59e05995ea2b0b37760923ae5"} Oct 03 14:46:43 crc kubenswrapper[4774]: I1003 14:46:43.062262 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mg7ls" podStartSLOduration=2.227937439 podStartE2EDuration="1m6.062244924s" podCreationTimestamp="2025-10-03 14:45:37 +0000 UTC" firstStartedPulling="2025-10-03 14:45:38.593657434 +0000 UTC m=+161.182860886" lastFinishedPulling="2025-10-03 14:46:42.427964909 +0000 UTC m=+225.017168371" observedRunningTime="2025-10-03 14:46:43.060610252 +0000 UTC m=+225.649813724" watchObservedRunningTime="2025-10-03 14:46:43.062244924 +0000 UTC m=+225.651448376" Oct 03 14:46:43 crc kubenswrapper[4774]: I1003 14:46:43.081793 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hntbw" podStartSLOduration=1.7255044609999999 podStartE2EDuration="1m7.081776691s" podCreationTimestamp="2025-10-03 
14:45:36 +0000 UTC" firstStartedPulling="2025-10-03 14:45:37.537601365 +0000 UTC m=+160.126804827" lastFinishedPulling="2025-10-03 14:46:42.893873605 +0000 UTC m=+225.483077057" observedRunningTime="2025-10-03 14:46:43.081073828 +0000 UTC m=+225.670277290" watchObservedRunningTime="2025-10-03 14:46:43.081776691 +0000 UTC m=+225.670980143" Oct 03 14:46:43 crc kubenswrapper[4774]: I1003 14:46:43.098691 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wbncg" podStartSLOduration=3.081211384 podStartE2EDuration="1m9.098670894s" podCreationTimestamp="2025-10-03 14:45:34 +0000 UTC" firstStartedPulling="2025-10-03 14:45:36.494326717 +0000 UTC m=+159.083530169" lastFinishedPulling="2025-10-03 14:46:42.511786227 +0000 UTC m=+225.100989679" observedRunningTime="2025-10-03 14:46:43.097147316 +0000 UTC m=+225.686350768" watchObservedRunningTime="2025-10-03 14:46:43.098670894 +0000 UTC m=+225.687874346" Oct 03 14:46:44 crc kubenswrapper[4774]: I1003 14:46:44.433644 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2bx5m" Oct 03 14:46:44 crc kubenswrapper[4774]: I1003 14:46:44.434607 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2bx5m" Oct 03 14:46:44 crc kubenswrapper[4774]: I1003 14:46:44.486683 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2bx5m" Oct 03 14:46:44 crc kubenswrapper[4774]: I1003 14:46:44.606109 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wbncg" Oct 03 14:46:44 crc kubenswrapper[4774]: I1003 14:46:44.606152 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wbncg" Oct 03 14:46:44 crc kubenswrapper[4774]: I1003 14:46:44.656304 4774 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wbncg" Oct 03 14:46:44 crc kubenswrapper[4774]: I1003 14:46:44.910363 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6p5qb" Oct 03 14:46:44 crc kubenswrapper[4774]: I1003 14:46:44.910438 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6p5qb" Oct 03 14:46:44 crc kubenswrapper[4774]: I1003 14:46:44.949774 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6p5qb" Oct 03 14:46:45 crc kubenswrapper[4774]: I1003 14:46:45.097589 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6p5qb" Oct 03 14:46:46 crc kubenswrapper[4774]: I1003 14:46:46.131336 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2bx5m" Oct 03 14:46:46 crc kubenswrapper[4774]: I1003 14:46:46.549764 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hntbw" Oct 03 14:46:46 crc kubenswrapper[4774]: I1003 14:46:46.549841 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hntbw" Oct 03 14:46:46 crc kubenswrapper[4774]: I1003 14:46:46.606751 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hntbw" Oct 03 14:46:46 crc kubenswrapper[4774]: I1003 14:46:46.979623 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6szd8" Oct 03 14:46:46 crc kubenswrapper[4774]: I1003 14:46:46.979733 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-6szd8" Oct 03 14:46:47 crc kubenswrapper[4774]: I1003 14:46:47.020503 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6szd8" Oct 03 14:46:47 crc kubenswrapper[4774]: I1003 14:46:47.123457 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6szd8" Oct 03 14:46:47 crc kubenswrapper[4774]: I1003 14:46:47.348791 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mg7ls" Oct 03 14:46:47 crc kubenswrapper[4774]: I1003 14:46:47.348873 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mg7ls" Oct 03 14:46:47 crc kubenswrapper[4774]: I1003 14:46:47.389847 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mg7ls" Oct 03 14:46:47 crc kubenswrapper[4774]: I1003 14:46:47.470007 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6p5qb"] Oct 03 14:46:47 crc kubenswrapper[4774]: I1003 14:46:47.470281 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6p5qb" podUID="082e67c6-a87d-48ea-90e3-e20178613597" containerName="registry-server" containerID="cri-o://43ae7255341a7c9b51d6860fc814a0483912cf72040233bc5fcf086250c90693" gracePeriod=2 Oct 03 14:46:47 crc kubenswrapper[4774]: I1003 14:46:47.558217 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kj6jv" Oct 03 14:46:47 crc kubenswrapper[4774]: I1003 14:46:47.558273 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kj6jv" Oct 03 14:46:47 crc kubenswrapper[4774]: I1003 14:46:47.621033 4774 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kj6jv" Oct 03 14:46:48 crc kubenswrapper[4774]: I1003 14:46:48.141770 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mg7ls" Oct 03 14:46:48 crc kubenswrapper[4774]: I1003 14:46:48.143943 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kj6jv" Oct 03 14:46:49 crc kubenswrapper[4774]: I1003 14:46:49.094779 4774 generic.go:334] "Generic (PLEG): container finished" podID="082e67c6-a87d-48ea-90e3-e20178613597" containerID="43ae7255341a7c9b51d6860fc814a0483912cf72040233bc5fcf086250c90693" exitCode=0 Oct 03 14:46:49 crc kubenswrapper[4774]: I1003 14:46:49.095987 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6p5qb" event={"ID":"082e67c6-a87d-48ea-90e3-e20178613597","Type":"ContainerDied","Data":"43ae7255341a7c9b51d6860fc814a0483912cf72040233bc5fcf086250c90693"} Oct 03 14:46:49 crc kubenswrapper[4774]: I1003 14:46:49.294146 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6p5qb" Oct 03 14:46:49 crc kubenswrapper[4774]: I1003 14:46:49.372568 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkmzf\" (UniqueName: \"kubernetes.io/projected/082e67c6-a87d-48ea-90e3-e20178613597-kube-api-access-zkmzf\") pod \"082e67c6-a87d-48ea-90e3-e20178613597\" (UID: \"082e67c6-a87d-48ea-90e3-e20178613597\") " Oct 03 14:46:49 crc kubenswrapper[4774]: I1003 14:46:49.372752 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/082e67c6-a87d-48ea-90e3-e20178613597-utilities\") pod \"082e67c6-a87d-48ea-90e3-e20178613597\" (UID: \"082e67c6-a87d-48ea-90e3-e20178613597\") " Oct 03 14:46:49 crc kubenswrapper[4774]: I1003 14:46:49.372794 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/082e67c6-a87d-48ea-90e3-e20178613597-catalog-content\") pod \"082e67c6-a87d-48ea-90e3-e20178613597\" (UID: \"082e67c6-a87d-48ea-90e3-e20178613597\") " Oct 03 14:46:49 crc kubenswrapper[4774]: I1003 14:46:49.375607 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/082e67c6-a87d-48ea-90e3-e20178613597-utilities" (OuterVolumeSpecName: "utilities") pod "082e67c6-a87d-48ea-90e3-e20178613597" (UID: "082e67c6-a87d-48ea-90e3-e20178613597"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:46:49 crc kubenswrapper[4774]: I1003 14:46:49.379226 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/082e67c6-a87d-48ea-90e3-e20178613597-kube-api-access-zkmzf" (OuterVolumeSpecName: "kube-api-access-zkmzf") pod "082e67c6-a87d-48ea-90e3-e20178613597" (UID: "082e67c6-a87d-48ea-90e3-e20178613597"). InnerVolumeSpecName "kube-api-access-zkmzf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:46:49 crc kubenswrapper[4774]: I1003 14:46:49.422340 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/082e67c6-a87d-48ea-90e3-e20178613597-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "082e67c6-a87d-48ea-90e3-e20178613597" (UID: "082e67c6-a87d-48ea-90e3-e20178613597"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:46:49 crc kubenswrapper[4774]: I1003 14:46:49.474714 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkmzf\" (UniqueName: \"kubernetes.io/projected/082e67c6-a87d-48ea-90e3-e20178613597-kube-api-access-zkmzf\") on node \"crc\" DevicePath \"\"" Oct 03 14:46:49 crc kubenswrapper[4774]: I1003 14:46:49.474765 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/082e67c6-a87d-48ea-90e3-e20178613597-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:46:49 crc kubenswrapper[4774]: I1003 14:46:49.474785 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/082e67c6-a87d-48ea-90e3-e20178613597-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:46:49 crc kubenswrapper[4774]: I1003 14:46:49.669303 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6szd8"] Oct 03 14:46:49 crc kubenswrapper[4774]: I1003 14:46:49.670069 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6szd8" podUID="8dfe8164-582f-46df-a0a7-790c7df9e6f9" containerName="registry-server" containerID="cri-o://50e1b199f8c5ffec1c65e8fc4c5d64a06221b40d733818a08a69ca18806a88b0" gracePeriod=2 Oct 03 14:46:49 crc kubenswrapper[4774]: I1003 14:46:49.870976 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-kj6jv"] Oct 03 14:46:49 crc kubenswrapper[4774]: I1003 14:46:49.945924 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6szd8" Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.084180 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dfe8164-582f-46df-a0a7-790c7df9e6f9-utilities\") pod \"8dfe8164-582f-46df-a0a7-790c7df9e6f9\" (UID: \"8dfe8164-582f-46df-a0a7-790c7df9e6f9\") " Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.084268 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wchl4\" (UniqueName: \"kubernetes.io/projected/8dfe8164-582f-46df-a0a7-790c7df9e6f9-kube-api-access-wchl4\") pod \"8dfe8164-582f-46df-a0a7-790c7df9e6f9\" (UID: \"8dfe8164-582f-46df-a0a7-790c7df9e6f9\") " Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.084478 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dfe8164-582f-46df-a0a7-790c7df9e6f9-catalog-content\") pod \"8dfe8164-582f-46df-a0a7-790c7df9e6f9\" (UID: \"8dfe8164-582f-46df-a0a7-790c7df9e6f9\") " Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.085121 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dfe8164-582f-46df-a0a7-790c7df9e6f9-utilities" (OuterVolumeSpecName: "utilities") pod "8dfe8164-582f-46df-a0a7-790c7df9e6f9" (UID: "8dfe8164-582f-46df-a0a7-790c7df9e6f9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.088800 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dfe8164-582f-46df-a0a7-790c7df9e6f9-kube-api-access-wchl4" (OuterVolumeSpecName: "kube-api-access-wchl4") pod "8dfe8164-582f-46df-a0a7-790c7df9e6f9" (UID: "8dfe8164-582f-46df-a0a7-790c7df9e6f9"). InnerVolumeSpecName "kube-api-access-wchl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.102059 4774 generic.go:334] "Generic (PLEG): container finished" podID="8dfe8164-582f-46df-a0a7-790c7df9e6f9" containerID="50e1b199f8c5ffec1c65e8fc4c5d64a06221b40d733818a08a69ca18806a88b0" exitCode=0 Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.102130 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6szd8" event={"ID":"8dfe8164-582f-46df-a0a7-790c7df9e6f9","Type":"ContainerDied","Data":"50e1b199f8c5ffec1c65e8fc4c5d64a06221b40d733818a08a69ca18806a88b0"} Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.102168 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6szd8" Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.102192 4774 scope.go:117] "RemoveContainer" containerID="50e1b199f8c5ffec1c65e8fc4c5d64a06221b40d733818a08a69ca18806a88b0" Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.102179 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6szd8" event={"ID":"8dfe8164-582f-46df-a0a7-790c7df9e6f9","Type":"ContainerDied","Data":"410476bd19342eb970cc2a4d94063572cad85cea4ee0e7a326da4993b75a857e"} Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.104284 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6p5qb" Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.104340 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6p5qb" event={"ID":"082e67c6-a87d-48ea-90e3-e20178613597","Type":"ContainerDied","Data":"59a6617e964a4e8ba76b9abc3b1dba2231af5975de0a554348f1c9516fbe0ce2"} Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.104419 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kj6jv" podUID="ea488297-1ed1-475d-8c6a-9b4881dd5583" containerName="registry-server" containerID="cri-o://57bcafe6177ef6fbedb031d4be4233c22bdc57700beb1ac7e4e1f4a2a3ace33f" gracePeriod=2 Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.106018 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dfe8164-582f-46df-a0a7-790c7df9e6f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8dfe8164-582f-46df-a0a7-790c7df9e6f9" (UID: "8dfe8164-582f-46df-a0a7-790c7df9e6f9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.120496 4774 scope.go:117] "RemoveContainer" containerID="02bad132f351c10a8594fd5edcc4aa03929f11b5f25aa8c578aab9663bf4c641" Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.130620 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6p5qb"] Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.133692 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6p5qb"] Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.137209 4774 scope.go:117] "RemoveContainer" containerID="addc74e6136e493c31d72f35321f895663600c970ed40e4fdee28cb72976f528" Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.150675 4774 scope.go:117] "RemoveContainer" containerID="50e1b199f8c5ffec1c65e8fc4c5d64a06221b40d733818a08a69ca18806a88b0" Oct 03 14:46:50 crc kubenswrapper[4774]: E1003 14:46:50.151150 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50e1b199f8c5ffec1c65e8fc4c5d64a06221b40d733818a08a69ca18806a88b0\": container with ID starting with 50e1b199f8c5ffec1c65e8fc4c5d64a06221b40d733818a08a69ca18806a88b0 not found: ID does not exist" containerID="50e1b199f8c5ffec1c65e8fc4c5d64a06221b40d733818a08a69ca18806a88b0" Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.151185 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50e1b199f8c5ffec1c65e8fc4c5d64a06221b40d733818a08a69ca18806a88b0"} err="failed to get container status \"50e1b199f8c5ffec1c65e8fc4c5d64a06221b40d733818a08a69ca18806a88b0\": rpc error: code = NotFound desc = could not find container \"50e1b199f8c5ffec1c65e8fc4c5d64a06221b40d733818a08a69ca18806a88b0\": container with ID starting with 50e1b199f8c5ffec1c65e8fc4c5d64a06221b40d733818a08a69ca18806a88b0 not found: ID does not exist" 
Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.151231 4774 scope.go:117] "RemoveContainer" containerID="02bad132f351c10a8594fd5edcc4aa03929f11b5f25aa8c578aab9663bf4c641" Oct 03 14:46:50 crc kubenswrapper[4774]: E1003 14:46:50.151568 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02bad132f351c10a8594fd5edcc4aa03929f11b5f25aa8c578aab9663bf4c641\": container with ID starting with 02bad132f351c10a8594fd5edcc4aa03929f11b5f25aa8c578aab9663bf4c641 not found: ID does not exist" containerID="02bad132f351c10a8594fd5edcc4aa03929f11b5f25aa8c578aab9663bf4c641" Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.151602 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02bad132f351c10a8594fd5edcc4aa03929f11b5f25aa8c578aab9663bf4c641"} err="failed to get container status \"02bad132f351c10a8594fd5edcc4aa03929f11b5f25aa8c578aab9663bf4c641\": rpc error: code = NotFound desc = could not find container \"02bad132f351c10a8594fd5edcc4aa03929f11b5f25aa8c578aab9663bf4c641\": container with ID starting with 02bad132f351c10a8594fd5edcc4aa03929f11b5f25aa8c578aab9663bf4c641 not found: ID does not exist" Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.151622 4774 scope.go:117] "RemoveContainer" containerID="addc74e6136e493c31d72f35321f895663600c970ed40e4fdee28cb72976f528" Oct 03 14:46:50 crc kubenswrapper[4774]: E1003 14:46:50.151914 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"addc74e6136e493c31d72f35321f895663600c970ed40e4fdee28cb72976f528\": container with ID starting with addc74e6136e493c31d72f35321f895663600c970ed40e4fdee28cb72976f528 not found: ID does not exist" containerID="addc74e6136e493c31d72f35321f895663600c970ed40e4fdee28cb72976f528" Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.151958 4774 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"addc74e6136e493c31d72f35321f895663600c970ed40e4fdee28cb72976f528"} err="failed to get container status \"addc74e6136e493c31d72f35321f895663600c970ed40e4fdee28cb72976f528\": rpc error: code = NotFound desc = could not find container \"addc74e6136e493c31d72f35321f895663600c970ed40e4fdee28cb72976f528\": container with ID starting with addc74e6136e493c31d72f35321f895663600c970ed40e4fdee28cb72976f528 not found: ID does not exist" Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.151985 4774 scope.go:117] "RemoveContainer" containerID="43ae7255341a7c9b51d6860fc814a0483912cf72040233bc5fcf086250c90693" Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.169289 4774 scope.go:117] "RemoveContainer" containerID="cff9293a3a7cdf5e4e770c8ed9ba2357c6ceea33672a623489c8127ebf6a9871" Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.185691 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dfe8164-582f-46df-a0a7-790c7df9e6f9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.185742 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dfe8164-582f-46df-a0a7-790c7df9e6f9-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.185758 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wchl4\" (UniqueName: \"kubernetes.io/projected/8dfe8164-582f-46df-a0a7-790c7df9e6f9-kube-api-access-wchl4\") on node \"crc\" DevicePath \"\"" Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.233513 4774 scope.go:117] "RemoveContainer" containerID="e67058ff9e92987338d048ace401510a23b337c397b1507ea97d5e35a091521c" Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.436318 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kj6jv" Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.445329 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6szd8"] Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.447600 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6szd8"] Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.589815 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxtfx\" (UniqueName: \"kubernetes.io/projected/ea488297-1ed1-475d-8c6a-9b4881dd5583-kube-api-access-xxtfx\") pod \"ea488297-1ed1-475d-8c6a-9b4881dd5583\" (UID: \"ea488297-1ed1-475d-8c6a-9b4881dd5583\") " Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.589894 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea488297-1ed1-475d-8c6a-9b4881dd5583-catalog-content\") pod \"ea488297-1ed1-475d-8c6a-9b4881dd5583\" (UID: \"ea488297-1ed1-475d-8c6a-9b4881dd5583\") " Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.589973 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea488297-1ed1-475d-8c6a-9b4881dd5583-utilities\") pod \"ea488297-1ed1-475d-8c6a-9b4881dd5583\" (UID: \"ea488297-1ed1-475d-8c6a-9b4881dd5583\") " Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.591101 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea488297-1ed1-475d-8c6a-9b4881dd5583-utilities" (OuterVolumeSpecName: "utilities") pod "ea488297-1ed1-475d-8c6a-9b4881dd5583" (UID: "ea488297-1ed1-475d-8c6a-9b4881dd5583"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.596453 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea488297-1ed1-475d-8c6a-9b4881dd5583-kube-api-access-xxtfx" (OuterVolumeSpecName: "kube-api-access-xxtfx") pod "ea488297-1ed1-475d-8c6a-9b4881dd5583" (UID: "ea488297-1ed1-475d-8c6a-9b4881dd5583"). InnerVolumeSpecName "kube-api-access-xxtfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.653406 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.653512 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.653597 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.654461 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bac2decca2a2a2fda80b2eb3cf96d985ca649fc4317858f0c2cb356d57d5c055"} pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.654541 4774 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" containerID="cri-o://bac2decca2a2a2fda80b2eb3cf96d985ca649fc4317858f0c2cb356d57d5c055" gracePeriod=600 Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.691428 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxtfx\" (UniqueName: \"kubernetes.io/projected/ea488297-1ed1-475d-8c6a-9b4881dd5583-kube-api-access-xxtfx\") on node \"crc\" DevicePath \"\"" Oct 03 14:46:50 crc kubenswrapper[4774]: I1003 14:46:50.691485 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea488297-1ed1-475d-8c6a-9b4881dd5583-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:46:51 crc kubenswrapper[4774]: I1003 14:46:51.114538 4774 generic.go:334] "Generic (PLEG): container finished" podID="ea488297-1ed1-475d-8c6a-9b4881dd5583" containerID="57bcafe6177ef6fbedb031d4be4233c22bdc57700beb1ac7e4e1f4a2a3ace33f" exitCode=0 Oct 03 14:46:51 crc kubenswrapper[4774]: I1003 14:46:51.114618 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kj6jv" event={"ID":"ea488297-1ed1-475d-8c6a-9b4881dd5583","Type":"ContainerDied","Data":"57bcafe6177ef6fbedb031d4be4233c22bdc57700beb1ac7e4e1f4a2a3ace33f"} Oct 03 14:46:51 crc kubenswrapper[4774]: I1003 14:46:51.114646 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kj6jv" event={"ID":"ea488297-1ed1-475d-8c6a-9b4881dd5583","Type":"ContainerDied","Data":"6fd5f0eafe0fce905e2e0341755601d4ca50392545238b39514f13c26705307b"} Oct 03 14:46:51 crc kubenswrapper[4774]: I1003 14:46:51.114667 4774 scope.go:117] "RemoveContainer" containerID="57bcafe6177ef6fbedb031d4be4233c22bdc57700beb1ac7e4e1f4a2a3ace33f" Oct 03 14:46:51 crc kubenswrapper[4774]: I1003 14:46:51.114855 
4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kj6jv" Oct 03 14:46:51 crc kubenswrapper[4774]: I1003 14:46:51.135361 4774 scope.go:117] "RemoveContainer" containerID="7511540248eed3cf2f2e6a6a37158a3f16d34542b7203bcbb7697009833bd9af" Oct 03 14:46:51 crc kubenswrapper[4774]: I1003 14:46:51.159476 4774 scope.go:117] "RemoveContainer" containerID="7fd2445ccc390b5e5c51155b0065619344aca66ba63722d5a832408e1d5836da" Oct 03 14:46:51 crc kubenswrapper[4774]: I1003 14:46:51.171946 4774 scope.go:117] "RemoveContainer" containerID="57bcafe6177ef6fbedb031d4be4233c22bdc57700beb1ac7e4e1f4a2a3ace33f" Oct 03 14:46:51 crc kubenswrapper[4774]: E1003 14:46:51.172622 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57bcafe6177ef6fbedb031d4be4233c22bdc57700beb1ac7e4e1f4a2a3ace33f\": container with ID starting with 57bcafe6177ef6fbedb031d4be4233c22bdc57700beb1ac7e4e1f4a2a3ace33f not found: ID does not exist" containerID="57bcafe6177ef6fbedb031d4be4233c22bdc57700beb1ac7e4e1f4a2a3ace33f" Oct 03 14:46:51 crc kubenswrapper[4774]: I1003 14:46:51.172668 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57bcafe6177ef6fbedb031d4be4233c22bdc57700beb1ac7e4e1f4a2a3ace33f"} err="failed to get container status \"57bcafe6177ef6fbedb031d4be4233c22bdc57700beb1ac7e4e1f4a2a3ace33f\": rpc error: code = NotFound desc = could not find container \"57bcafe6177ef6fbedb031d4be4233c22bdc57700beb1ac7e4e1f4a2a3ace33f\": container with ID starting with 57bcafe6177ef6fbedb031d4be4233c22bdc57700beb1ac7e4e1f4a2a3ace33f not found: ID does not exist" Oct 03 14:46:51 crc kubenswrapper[4774]: I1003 14:46:51.172695 4774 scope.go:117] "RemoveContainer" containerID="7511540248eed3cf2f2e6a6a37158a3f16d34542b7203bcbb7697009833bd9af" Oct 03 14:46:51 crc kubenswrapper[4774]: E1003 14:46:51.173156 4774 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7511540248eed3cf2f2e6a6a37158a3f16d34542b7203bcbb7697009833bd9af\": container with ID starting with 7511540248eed3cf2f2e6a6a37158a3f16d34542b7203bcbb7697009833bd9af not found: ID does not exist" containerID="7511540248eed3cf2f2e6a6a37158a3f16d34542b7203bcbb7697009833bd9af" Oct 03 14:46:51 crc kubenswrapper[4774]: I1003 14:46:51.173208 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7511540248eed3cf2f2e6a6a37158a3f16d34542b7203bcbb7697009833bd9af"} err="failed to get container status \"7511540248eed3cf2f2e6a6a37158a3f16d34542b7203bcbb7697009833bd9af\": rpc error: code = NotFound desc = could not find container \"7511540248eed3cf2f2e6a6a37158a3f16d34542b7203bcbb7697009833bd9af\": container with ID starting with 7511540248eed3cf2f2e6a6a37158a3f16d34542b7203bcbb7697009833bd9af not found: ID does not exist" Oct 03 14:46:51 crc kubenswrapper[4774]: I1003 14:46:51.173241 4774 scope.go:117] "RemoveContainer" containerID="7fd2445ccc390b5e5c51155b0065619344aca66ba63722d5a832408e1d5836da" Oct 03 14:46:51 crc kubenswrapper[4774]: E1003 14:46:51.173650 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fd2445ccc390b5e5c51155b0065619344aca66ba63722d5a832408e1d5836da\": container with ID starting with 7fd2445ccc390b5e5c51155b0065619344aca66ba63722d5a832408e1d5836da not found: ID does not exist" containerID="7fd2445ccc390b5e5c51155b0065619344aca66ba63722d5a832408e1d5836da" Oct 03 14:46:51 crc kubenswrapper[4774]: I1003 14:46:51.173689 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fd2445ccc390b5e5c51155b0065619344aca66ba63722d5a832408e1d5836da"} err="failed to get container status \"7fd2445ccc390b5e5c51155b0065619344aca66ba63722d5a832408e1d5836da\": rpc error: code = NotFound desc = could 
not find container \"7fd2445ccc390b5e5c51155b0065619344aca66ba63722d5a832408e1d5836da\": container with ID starting with 7fd2445ccc390b5e5c51155b0065619344aca66ba63722d5a832408e1d5836da not found: ID does not exist" Oct 03 14:46:51 crc kubenswrapper[4774]: I1003 14:46:51.233501 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea488297-1ed1-475d-8c6a-9b4881dd5583-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea488297-1ed1-475d-8c6a-9b4881dd5583" (UID: "ea488297-1ed1-475d-8c6a-9b4881dd5583"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:46:51 crc kubenswrapper[4774]: I1003 14:46:51.299921 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea488297-1ed1-475d-8c6a-9b4881dd5583-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:46:51 crc kubenswrapper[4774]: I1003 14:46:51.310031 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="082e67c6-a87d-48ea-90e3-e20178613597" path="/var/lib/kubelet/pods/082e67c6-a87d-48ea-90e3-e20178613597/volumes" Oct 03 14:46:51 crc kubenswrapper[4774]: I1003 14:46:51.311956 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dfe8164-582f-46df-a0a7-790c7df9e6f9" path="/var/lib/kubelet/pods/8dfe8164-582f-46df-a0a7-790c7df9e6f9/volumes" Oct 03 14:46:51 crc kubenswrapper[4774]: I1003 14:46:51.430937 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kj6jv"] Oct 03 14:46:51 crc kubenswrapper[4774]: I1003 14:46:51.434181 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kj6jv"] Oct 03 14:46:52 crc kubenswrapper[4774]: I1003 14:46:52.125303 4774 generic.go:334] "Generic (PLEG): container finished" podID="ca37ac4b-f421-4198-a179-12901d36f0f5" 
containerID="bac2decca2a2a2fda80b2eb3cf96d985ca649fc4317858f0c2cb356d57d5c055" exitCode=0 Oct 03 14:46:52 crc kubenswrapper[4774]: I1003 14:46:52.125396 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" event={"ID":"ca37ac4b-f421-4198-a179-12901d36f0f5","Type":"ContainerDied","Data":"bac2decca2a2a2fda80b2eb3cf96d985ca649fc4317858f0c2cb356d57d5c055"} Oct 03 14:46:52 crc kubenswrapper[4774]: I1003 14:46:52.125678 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" event={"ID":"ca37ac4b-f421-4198-a179-12901d36f0f5","Type":"ContainerStarted","Data":"c5ad343ebc4fbb79d9c25d28242be0ed044b9a7ef63c6a844189fb94cda2175a"} Oct 03 14:46:53 crc kubenswrapper[4774]: I1003 14:46:53.306705 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea488297-1ed1-475d-8c6a-9b4881dd5583" path="/var/lib/kubelet/pods/ea488297-1ed1-475d-8c6a-9b4881dd5583/volumes" Oct 03 14:46:54 crc kubenswrapper[4774]: I1003 14:46:54.640298 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wbncg" Oct 03 14:46:56 crc kubenswrapper[4774]: I1003 14:46:56.592758 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hntbw" Oct 03 14:46:57 crc kubenswrapper[4774]: I1003 14:46:57.743612 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-p27cs"] Oct 03 14:46:58 crc kubenswrapper[4774]: I1003 14:46:58.267088 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wbncg"] Oct 03 14:46:58 crc kubenswrapper[4774]: I1003 14:46:58.267348 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wbncg" podUID="8656ee03-e27c-4ba6-a803-ab372ecb9b7b" 
containerName="registry-server" containerID="cri-o://0e030a7effaae6d6ae265a1cefddc9dfa03e397c913f1ed3b6480b89dcaeafb0" gracePeriod=2 Oct 03 14:46:58 crc kubenswrapper[4774]: I1003 14:46:58.619615 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wbncg" Oct 03 14:46:58 crc kubenswrapper[4774]: I1003 14:46:58.701661 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8656ee03-e27c-4ba6-a803-ab372ecb9b7b-catalog-content\") pod \"8656ee03-e27c-4ba6-a803-ab372ecb9b7b\" (UID: \"8656ee03-e27c-4ba6-a803-ab372ecb9b7b\") " Oct 03 14:46:58 crc kubenswrapper[4774]: I1003 14:46:58.701756 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8656ee03-e27c-4ba6-a803-ab372ecb9b7b-utilities\") pod \"8656ee03-e27c-4ba6-a803-ab372ecb9b7b\" (UID: \"8656ee03-e27c-4ba6-a803-ab372ecb9b7b\") " Oct 03 14:46:58 crc kubenswrapper[4774]: I1003 14:46:58.701815 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78rsr\" (UniqueName: \"kubernetes.io/projected/8656ee03-e27c-4ba6-a803-ab372ecb9b7b-kube-api-access-78rsr\") pod \"8656ee03-e27c-4ba6-a803-ab372ecb9b7b\" (UID: \"8656ee03-e27c-4ba6-a803-ab372ecb9b7b\") " Oct 03 14:46:58 crc kubenswrapper[4774]: I1003 14:46:58.708791 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8656ee03-e27c-4ba6-a803-ab372ecb9b7b-utilities" (OuterVolumeSpecName: "utilities") pod "8656ee03-e27c-4ba6-a803-ab372ecb9b7b" (UID: "8656ee03-e27c-4ba6-a803-ab372ecb9b7b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:46:58 crc kubenswrapper[4774]: I1003 14:46:58.709711 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8656ee03-e27c-4ba6-a803-ab372ecb9b7b-kube-api-access-78rsr" (OuterVolumeSpecName: "kube-api-access-78rsr") pod "8656ee03-e27c-4ba6-a803-ab372ecb9b7b" (UID: "8656ee03-e27c-4ba6-a803-ab372ecb9b7b"). InnerVolumeSpecName "kube-api-access-78rsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:46:58 crc kubenswrapper[4774]: I1003 14:46:58.760780 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8656ee03-e27c-4ba6-a803-ab372ecb9b7b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8656ee03-e27c-4ba6-a803-ab372ecb9b7b" (UID: "8656ee03-e27c-4ba6-a803-ab372ecb9b7b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:46:58 crc kubenswrapper[4774]: I1003 14:46:58.802959 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8656ee03-e27c-4ba6-a803-ab372ecb9b7b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:46:58 crc kubenswrapper[4774]: I1003 14:46:58.802997 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8656ee03-e27c-4ba6-a803-ab372ecb9b7b-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:46:58 crc kubenswrapper[4774]: I1003 14:46:58.803010 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78rsr\" (UniqueName: \"kubernetes.io/projected/8656ee03-e27c-4ba6-a803-ab372ecb9b7b-kube-api-access-78rsr\") on node \"crc\" DevicePath \"\"" Oct 03 14:46:59 crc kubenswrapper[4774]: I1003 14:46:59.166226 4774 generic.go:334] "Generic (PLEG): container finished" podID="8656ee03-e27c-4ba6-a803-ab372ecb9b7b" 
containerID="0e030a7effaae6d6ae265a1cefddc9dfa03e397c913f1ed3b6480b89dcaeafb0" exitCode=0 Oct 03 14:46:59 crc kubenswrapper[4774]: I1003 14:46:59.166261 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbncg" event={"ID":"8656ee03-e27c-4ba6-a803-ab372ecb9b7b","Type":"ContainerDied","Data":"0e030a7effaae6d6ae265a1cefddc9dfa03e397c913f1ed3b6480b89dcaeafb0"} Oct 03 14:46:59 crc kubenswrapper[4774]: I1003 14:46:59.166300 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbncg" event={"ID":"8656ee03-e27c-4ba6-a803-ab372ecb9b7b","Type":"ContainerDied","Data":"184ae1adfaf5cd185d9ee92ea6ccfc99fc82df45d4199b68d66b29d19493f3cf"} Oct 03 14:46:59 crc kubenswrapper[4774]: I1003 14:46:59.166321 4774 scope.go:117] "RemoveContainer" containerID="0e030a7effaae6d6ae265a1cefddc9dfa03e397c913f1ed3b6480b89dcaeafb0" Oct 03 14:46:59 crc kubenswrapper[4774]: I1003 14:46:59.166424 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wbncg" Oct 03 14:46:59 crc kubenswrapper[4774]: I1003 14:46:59.186483 4774 scope.go:117] "RemoveContainer" containerID="22b02c966c08ad3fa17ffa98ea3eec18da00a1616ca69d7d3d851e04e702ca19" Oct 03 14:46:59 crc kubenswrapper[4774]: I1003 14:46:59.195241 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wbncg"] Oct 03 14:46:59 crc kubenswrapper[4774]: I1003 14:46:59.196752 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wbncg"] Oct 03 14:46:59 crc kubenswrapper[4774]: I1003 14:46:59.203861 4774 scope.go:117] "RemoveContainer" containerID="1c61f1a30913bf289b784895fc7d6d1c332d634278a052bf41eb774ae6c8f60f" Oct 03 14:46:59 crc kubenswrapper[4774]: I1003 14:46:59.219589 4774 scope.go:117] "RemoveContainer" containerID="0e030a7effaae6d6ae265a1cefddc9dfa03e397c913f1ed3b6480b89dcaeafb0" Oct 03 14:46:59 crc kubenswrapper[4774]: E1003 14:46:59.220035 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e030a7effaae6d6ae265a1cefddc9dfa03e397c913f1ed3b6480b89dcaeafb0\": container with ID starting with 0e030a7effaae6d6ae265a1cefddc9dfa03e397c913f1ed3b6480b89dcaeafb0 not found: ID does not exist" containerID="0e030a7effaae6d6ae265a1cefddc9dfa03e397c913f1ed3b6480b89dcaeafb0" Oct 03 14:46:59 crc kubenswrapper[4774]: I1003 14:46:59.220083 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e030a7effaae6d6ae265a1cefddc9dfa03e397c913f1ed3b6480b89dcaeafb0"} err="failed to get container status \"0e030a7effaae6d6ae265a1cefddc9dfa03e397c913f1ed3b6480b89dcaeafb0\": rpc error: code = NotFound desc = could not find container \"0e030a7effaae6d6ae265a1cefddc9dfa03e397c913f1ed3b6480b89dcaeafb0\": container with ID starting with 0e030a7effaae6d6ae265a1cefddc9dfa03e397c913f1ed3b6480b89dcaeafb0 not 
found: ID does not exist" Oct 03 14:46:59 crc kubenswrapper[4774]: I1003 14:46:59.220116 4774 scope.go:117] "RemoveContainer" containerID="22b02c966c08ad3fa17ffa98ea3eec18da00a1616ca69d7d3d851e04e702ca19" Oct 03 14:46:59 crc kubenswrapper[4774]: E1003 14:46:59.220442 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22b02c966c08ad3fa17ffa98ea3eec18da00a1616ca69d7d3d851e04e702ca19\": container with ID starting with 22b02c966c08ad3fa17ffa98ea3eec18da00a1616ca69d7d3d851e04e702ca19 not found: ID does not exist" containerID="22b02c966c08ad3fa17ffa98ea3eec18da00a1616ca69d7d3d851e04e702ca19" Oct 03 14:46:59 crc kubenswrapper[4774]: I1003 14:46:59.220474 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22b02c966c08ad3fa17ffa98ea3eec18da00a1616ca69d7d3d851e04e702ca19"} err="failed to get container status \"22b02c966c08ad3fa17ffa98ea3eec18da00a1616ca69d7d3d851e04e702ca19\": rpc error: code = NotFound desc = could not find container \"22b02c966c08ad3fa17ffa98ea3eec18da00a1616ca69d7d3d851e04e702ca19\": container with ID starting with 22b02c966c08ad3fa17ffa98ea3eec18da00a1616ca69d7d3d851e04e702ca19 not found: ID does not exist" Oct 03 14:46:59 crc kubenswrapper[4774]: I1003 14:46:59.220497 4774 scope.go:117] "RemoveContainer" containerID="1c61f1a30913bf289b784895fc7d6d1c332d634278a052bf41eb774ae6c8f60f" Oct 03 14:46:59 crc kubenswrapper[4774]: E1003 14:46:59.220712 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c61f1a30913bf289b784895fc7d6d1c332d634278a052bf41eb774ae6c8f60f\": container with ID starting with 1c61f1a30913bf289b784895fc7d6d1c332d634278a052bf41eb774ae6c8f60f not found: ID does not exist" containerID="1c61f1a30913bf289b784895fc7d6d1c332d634278a052bf41eb774ae6c8f60f" Oct 03 14:46:59 crc kubenswrapper[4774]: I1003 14:46:59.220734 4774 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c61f1a30913bf289b784895fc7d6d1c332d634278a052bf41eb774ae6c8f60f"} err="failed to get container status \"1c61f1a30913bf289b784895fc7d6d1c332d634278a052bf41eb774ae6c8f60f\": rpc error: code = NotFound desc = could not find container \"1c61f1a30913bf289b784895fc7d6d1c332d634278a052bf41eb774ae6c8f60f\": container with ID starting with 1c61f1a30913bf289b784895fc7d6d1c332d634278a052bf41eb774ae6c8f60f not found: ID does not exist" Oct 03 14:46:59 crc kubenswrapper[4774]: I1003 14:46:59.307618 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8656ee03-e27c-4ba6-a803-ab372ecb9b7b" path="/var/lib/kubelet/pods/8656ee03-e27c-4ba6-a803-ab372ecb9b7b/volumes" Oct 03 14:47:22 crc kubenswrapper[4774]: I1003 14:47:22.764059 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" podUID="aeba6428-0be6-4f7e-96f8-5b23fb08dd3b" containerName="oauth-openshift" containerID="cri-o://ea62533f404c55e689258825792b52a0d686c204a38958091b86cf6b6a3d50f9" gracePeriod=15 Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.121869 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.162886 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl"] Oct 03 14:47:23 crc kubenswrapper[4774]: E1003 14:47:23.163214 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="082e67c6-a87d-48ea-90e3-e20178613597" containerName="extract-utilities" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.163245 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="082e67c6-a87d-48ea-90e3-e20178613597" containerName="extract-utilities" Oct 03 14:47:23 crc kubenswrapper[4774]: E1003 14:47:23.163265 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dfe8164-582f-46df-a0a7-790c7df9e6f9" containerName="extract-utilities" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.163279 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dfe8164-582f-46df-a0a7-790c7df9e6f9" containerName="extract-utilities" Oct 03 14:47:23 crc kubenswrapper[4774]: E1003 14:47:23.163292 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="082e67c6-a87d-48ea-90e3-e20178613597" containerName="extract-content" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.163304 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="082e67c6-a87d-48ea-90e3-e20178613597" containerName="extract-content" Oct 03 14:47:23 crc kubenswrapper[4774]: E1003 14:47:23.163316 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8656ee03-e27c-4ba6-a803-ab372ecb9b7b" containerName="extract-utilities" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.163328 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="8656ee03-e27c-4ba6-a803-ab372ecb9b7b" containerName="extract-utilities" Oct 03 14:47:23 crc kubenswrapper[4774]: E1003 14:47:23.163338 4774 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ea488297-1ed1-475d-8c6a-9b4881dd5583" containerName="extract-content" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.163349 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea488297-1ed1-475d-8c6a-9b4881dd5583" containerName="extract-content" Oct 03 14:47:23 crc kubenswrapper[4774]: E1003 14:47:23.163361 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8656ee03-e27c-4ba6-a803-ab372ecb9b7b" containerName="registry-server" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.163392 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="8656ee03-e27c-4ba6-a803-ab372ecb9b7b" containerName="registry-server" Oct 03 14:47:23 crc kubenswrapper[4774]: E1003 14:47:23.163410 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="907dd334-f95b-417d-bbd7-402486f2fcff" containerName="pruner" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.163422 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="907dd334-f95b-417d-bbd7-402486f2fcff" containerName="pruner" Oct 03 14:47:23 crc kubenswrapper[4774]: E1003 14:47:23.163437 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="995e7b80-23e7-4f81-95b3-a3bc35a125ae" containerName="pruner" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.163446 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="995e7b80-23e7-4f81-95b3-a3bc35a125ae" containerName="pruner" Oct 03 14:47:23 crc kubenswrapper[4774]: E1003 14:47:23.163470 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8656ee03-e27c-4ba6-a803-ab372ecb9b7b" containerName="extract-content" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.163480 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="8656ee03-e27c-4ba6-a803-ab372ecb9b7b" containerName="extract-content" Oct 03 14:47:23 crc kubenswrapper[4774]: E1003 14:47:23.163495 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dfe8164-582f-46df-a0a7-790c7df9e6f9" 
containerName="extract-content" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.163506 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dfe8164-582f-46df-a0a7-790c7df9e6f9" containerName="extract-content" Oct 03 14:47:23 crc kubenswrapper[4774]: E1003 14:47:23.163517 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea488297-1ed1-475d-8c6a-9b4881dd5583" containerName="extract-utilities" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.163527 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea488297-1ed1-475d-8c6a-9b4881dd5583" containerName="extract-utilities" Oct 03 14:47:23 crc kubenswrapper[4774]: E1003 14:47:23.163538 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea488297-1ed1-475d-8c6a-9b4881dd5583" containerName="registry-server" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.163547 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea488297-1ed1-475d-8c6a-9b4881dd5583" containerName="registry-server" Oct 03 14:47:23 crc kubenswrapper[4774]: E1003 14:47:23.163561 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="082e67c6-a87d-48ea-90e3-e20178613597" containerName="registry-server" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.163571 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="082e67c6-a87d-48ea-90e3-e20178613597" containerName="registry-server" Oct 03 14:47:23 crc kubenswrapper[4774]: E1003 14:47:23.163589 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeba6428-0be6-4f7e-96f8-5b23fb08dd3b" containerName="oauth-openshift" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.163600 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeba6428-0be6-4f7e-96f8-5b23fb08dd3b" containerName="oauth-openshift" Oct 03 14:47:23 crc kubenswrapper[4774]: E1003 14:47:23.163613 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dfe8164-582f-46df-a0a7-790c7df9e6f9" 
containerName="registry-server" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.163624 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dfe8164-582f-46df-a0a7-790c7df9e6f9" containerName="registry-server" Oct 03 14:47:23 crc kubenswrapper[4774]: E1003 14:47:23.163637 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9" containerName="collect-profiles" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.163649 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9" containerName="collect-profiles" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.163790 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9" containerName="collect-profiles" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.163809 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea488297-1ed1-475d-8c6a-9b4881dd5583" containerName="registry-server" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.163821 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="995e7b80-23e7-4f81-95b3-a3bc35a125ae" containerName="pruner" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.163833 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeba6428-0be6-4f7e-96f8-5b23fb08dd3b" containerName="oauth-openshift" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.163853 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="082e67c6-a87d-48ea-90e3-e20178613597" containerName="registry-server" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.163863 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dfe8164-582f-46df-a0a7-790c7df9e6f9" containerName="registry-server" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.163878 4774 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8656ee03-e27c-4ba6-a803-ab372ecb9b7b" containerName="registry-server" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.163892 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="907dd334-f95b-417d-bbd7-402486f2fcff" containerName="pruner" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.166536 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.171840 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl"] Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.230426 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-user-template-login\") pod \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.230470 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-audit-dir\") pod \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.230492 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwprr\" (UniqueName: \"kubernetes.io/projected/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-kube-api-access-fwprr\") pod \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.230509 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-serving-cert\") pod \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.230535 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-ocp-branding-template\") pod \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.230553 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-cliconfig\") pod \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.230623 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-service-ca\") pod \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.230641 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-user-template-provider-selection\") pod \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.230659 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-router-certs\") pod \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.230675 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-audit-policies\") pod \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.230700 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-trusted-ca-bundle\") pod \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.230724 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-session\") pod \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.230749 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-user-template-error\") pod \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.230772 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-user-idp-0-file-data\") pod 
\"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\" (UID: \"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b\") " Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.231554 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "aeba6428-0be6-4f7e-96f8-5b23fb08dd3b" (UID: "aeba6428-0be6-4f7e-96f8-5b23fb08dd3b"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.232128 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "aeba6428-0be6-4f7e-96f8-5b23fb08dd3b" (UID: "aeba6428-0be6-4f7e-96f8-5b23fb08dd3b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.232326 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "aeba6428-0be6-4f7e-96f8-5b23fb08dd3b" (UID: "aeba6428-0be6-4f7e-96f8-5b23fb08dd3b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.232413 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "aeba6428-0be6-4f7e-96f8-5b23fb08dd3b" (UID: "aeba6428-0be6-4f7e-96f8-5b23fb08dd3b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.232411 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "aeba6428-0be6-4f7e-96f8-5b23fb08dd3b" (UID: "aeba6428-0be6-4f7e-96f8-5b23fb08dd3b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.236255 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "aeba6428-0be6-4f7e-96f8-5b23fb08dd3b" (UID: "aeba6428-0be6-4f7e-96f8-5b23fb08dd3b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.237074 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "aeba6428-0be6-4f7e-96f8-5b23fb08dd3b" (UID: "aeba6428-0be6-4f7e-96f8-5b23fb08dd3b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.237401 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "aeba6428-0be6-4f7e-96f8-5b23fb08dd3b" (UID: "aeba6428-0be6-4f7e-96f8-5b23fb08dd3b"). 
InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.237540 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-kube-api-access-fwprr" (OuterVolumeSpecName: "kube-api-access-fwprr") pod "aeba6428-0be6-4f7e-96f8-5b23fb08dd3b" (UID: "aeba6428-0be6-4f7e-96f8-5b23fb08dd3b"). InnerVolumeSpecName "kube-api-access-fwprr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.237824 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "aeba6428-0be6-4f7e-96f8-5b23fb08dd3b" (UID: "aeba6428-0be6-4f7e-96f8-5b23fb08dd3b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.237939 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "aeba6428-0be6-4f7e-96f8-5b23fb08dd3b" (UID: "aeba6428-0be6-4f7e-96f8-5b23fb08dd3b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.238197 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "aeba6428-0be6-4f7e-96f8-5b23fb08dd3b" (UID: "aeba6428-0be6-4f7e-96f8-5b23fb08dd3b"). 
InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.238438 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "aeba6428-0be6-4f7e-96f8-5b23fb08dd3b" (UID: "aeba6428-0be6-4f7e-96f8-5b23fb08dd3b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.238530 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "aeba6428-0be6-4f7e-96f8-5b23fb08dd3b" (UID: "aeba6428-0be6-4f7e-96f8-5b23fb08dd3b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.296643 4774 generic.go:334] "Generic (PLEG): container finished" podID="aeba6428-0be6-4f7e-96f8-5b23fb08dd3b" containerID="ea62533f404c55e689258825792b52a0d686c204a38958091b86cf6b6a3d50f9" exitCode=0 Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.296692 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" event={"ID":"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b","Type":"ContainerDied","Data":"ea62533f404c55e689258825792b52a0d686c204a38958091b86cf6b6a3d50f9"} Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.296723 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" event={"ID":"aeba6428-0be6-4f7e-96f8-5b23fb08dd3b","Type":"ContainerDied","Data":"f7a9328a8ba2525175dcab54bbd33023d074dbf8754e228f73ef1b5cf08d6f98"} Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.296742 4774 scope.go:117] "RemoveContainer" containerID="ea62533f404c55e689258825792b52a0d686c204a38958091b86cf6b6a3d50f9" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.296740 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-p27cs" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.315132 4774 scope.go:117] "RemoveContainer" containerID="ea62533f404c55e689258825792b52a0d686c204a38958091b86cf6b6a3d50f9" Oct 03 14:47:23 crc kubenswrapper[4774]: E1003 14:47:23.316159 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea62533f404c55e689258825792b52a0d686c204a38958091b86cf6b6a3d50f9\": container with ID starting with ea62533f404c55e689258825792b52a0d686c204a38958091b86cf6b6a3d50f9 not found: ID does not exist" containerID="ea62533f404c55e689258825792b52a0d686c204a38958091b86cf6b6a3d50f9" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.316328 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea62533f404c55e689258825792b52a0d686c204a38958091b86cf6b6a3d50f9"} err="failed to get container status \"ea62533f404c55e689258825792b52a0d686c204a38958091b86cf6b6a3d50f9\": rpc error: code = NotFound desc = could not find container \"ea62533f404c55e689258825792b52a0d686c204a38958091b86cf6b6a3d50f9\": container with ID starting with ea62533f404c55e689258825792b52a0d686c204a38958091b86cf6b6a3d50f9 not found: ID does not exist" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.328486 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-p27cs"] Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.332331 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-v4-0-config-user-template-login\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc 
kubenswrapper[4774]: I1003 14:47:23.332390 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-v4-0-config-system-session\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.332409 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.332424 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.332450 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-audit-policies\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.332487 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" 
(UniqueName: \"kubernetes.io/secret/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.332572 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.332601 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-audit-dir\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.332618 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-v4-0-config-system-router-certs\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.332672 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-v4-0-config-system-trusted-ca-bundle\") pod 
\"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.332709 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.332733 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-v4-0-config-system-service-ca\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.332876 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtb8s\" (UniqueName: \"kubernetes.io/projected/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-kube-api-access-qtb8s\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.333043 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-p27cs"] Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.333035 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-v4-0-config-user-template-error\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.333129 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.333145 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.333159 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.333320 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.333332 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.333343 4774 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-audit-dir\") on node \"crc\" 
DevicePath \"\"" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.333354 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwprr\" (UniqueName: \"kubernetes.io/projected/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-kube-api-access-fwprr\") on node \"crc\" DevicePath \"\"" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.333364 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.333393 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.333403 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.333413 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.333424 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.333435 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" 
(UniqueName: \"kubernetes.io/secret/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.333444 4774 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.435185 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtb8s\" (UniqueName: \"kubernetes.io/projected/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-kube-api-access-qtb8s\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.435302 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-v4-0-config-user-template-error\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.435351 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-v4-0-config-user-template-login\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.435506 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-v4-0-config-system-session\") 
pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.435617 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.435678 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.435773 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-audit-policies\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.435811 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.435879 4774 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.436861 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-audit-dir\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.437035 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-audit-dir\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.437104 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-v4-0-config-system-router-certs\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.437165 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-audit-policies\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " 
pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.437177 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.437298 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.437335 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-v4-0-config-system-service-ca\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.438171 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-v4-0-config-system-service-ca\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.438653 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.438744 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.440707 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.440720 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-v4-0-config-user-template-error\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.441509 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-v4-0-config-user-template-login\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " 
pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.441793 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.442536 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.442716 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-v4-0-config-system-router-certs\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.442967 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.443171 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-v4-0-config-system-session\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.452343 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtb8s\" (UniqueName: \"kubernetes.io/projected/5f6aa8b3-b336-4e2d-a937-512aac6c0f70-kube-api-access-qtb8s\") pod \"oauth-openshift-5f96bbd69c-4tqkl\" (UID: \"5f6aa8b3-b336-4e2d-a937-512aac6c0f70\") " pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.487678 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:23 crc kubenswrapper[4774]: I1003 14:47:23.876945 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl"] Oct 03 14:47:24 crc kubenswrapper[4774]: I1003 14:47:24.304259 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" event={"ID":"5f6aa8b3-b336-4e2d-a937-512aac6c0f70","Type":"ContainerStarted","Data":"618b0e5fae2297755c928f4d7025a397e94e348bfdb66d64e4a6d1474b2687f7"} Oct 03 14:47:24 crc kubenswrapper[4774]: I1003 14:47:24.304803 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:24 crc kubenswrapper[4774]: I1003 14:47:24.304906 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" event={"ID":"5f6aa8b3-b336-4e2d-a937-512aac6c0f70","Type":"ContainerStarted","Data":"80f83fe446dfa570da860f8cd43b3ca9be96dc010f84004636b99117a2646d67"} Oct 03 14:47:24 crc kubenswrapper[4774]: 
I1003 14:47:24.322901 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" podStartSLOduration=27.322885381 podStartE2EDuration="27.322885381s" podCreationTimestamp="2025-10-03 14:46:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:47:24.322110614 +0000 UTC m=+266.911314076" watchObservedRunningTime="2025-10-03 14:47:24.322885381 +0000 UTC m=+266.912088833" Oct 03 14:47:24 crc kubenswrapper[4774]: I1003 14:47:24.732128 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5f96bbd69c-4tqkl" Oct 03 14:47:25 crc kubenswrapper[4774]: I1003 14:47:25.307715 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeba6428-0be6-4f7e-96f8-5b23fb08dd3b" path="/var/lib/kubelet/pods/aeba6428-0be6-4f7e-96f8-5b23fb08dd3b/volumes" Oct 03 14:47:55 crc kubenswrapper[4774]: I1003 14:47:55.792352 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lvvp2"] Oct 03 14:47:55 crc kubenswrapper[4774]: I1003 14:47:55.793215 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lvvp2" podUID="e4a6d0e1-38db-42e1-8e29-b52de5bbad1d" containerName="registry-server" containerID="cri-o://10d22322ea6701471ae96fa67822eef255ecd138cab3a81c70d5dfdce2e58151" gracePeriod=30 Oct 03 14:47:55 crc kubenswrapper[4774]: I1003 14:47:55.798354 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2bx5m"] Oct 03 14:47:55 crc kubenswrapper[4774]: I1003 14:47:55.798615 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2bx5m" podUID="062191e1-9f34-4dba-bd1c-9bffe53f5cfd" containerName="registry-server" 
containerID="cri-o://1cc64d06fa501089dae4c94b6c5806c657e04d76020fda78779165c881d12b52" gracePeriod=30 Oct 03 14:47:55 crc kubenswrapper[4774]: I1003 14:47:55.817901 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wq2sd"] Oct 03 14:47:55 crc kubenswrapper[4774]: I1003 14:47:55.818220 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-wq2sd" podUID="923b3b0f-6810-4504-9757-fe4761f6ed37" containerName="marketplace-operator" containerID="cri-o://c733ea4a70192141edf13896a9e05ea86a2e061234cf9e5126aa9ee056a811a5" gracePeriod=30 Oct 03 14:47:55 crc kubenswrapper[4774]: I1003 14:47:55.835498 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hntbw"] Oct 03 14:47:55 crc kubenswrapper[4774]: I1003 14:47:55.836181 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hntbw" podUID="8922cd85-deea-4326-88fc-68c67debf56c" containerName="registry-server" containerID="cri-o://6b79066e99b1763fd752878b0f5e706ee85a60e59e05995ea2b0b37760923ae5" gracePeriod=30 Oct 03 14:47:55 crc kubenswrapper[4774]: I1003 14:47:55.847450 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p2hrr"] Oct 03 14:47:55 crc kubenswrapper[4774]: I1003 14:47:55.848339 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p2hrr" Oct 03 14:47:55 crc kubenswrapper[4774]: I1003 14:47:55.852473 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mg7ls"] Oct 03 14:47:55 crc kubenswrapper[4774]: I1003 14:47:55.852719 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mg7ls" podUID="2aad74fd-c975-42c4-8b05-ed28dbd55205" containerName="registry-server" containerID="cri-o://19ce9cd779ae623063419f702b4eac661c79d2dc10203bfa62c52c8e2683272f" gracePeriod=30 Oct 03 14:47:55 crc kubenswrapper[4774]: I1003 14:47:55.857525 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p2hrr"] Oct 03 14:47:55 crc kubenswrapper[4774]: I1003 14:47:55.895445 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3fd11a50-e44d-4d7f-b301-6c7069bf6096-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p2hrr\" (UID: \"3fd11a50-e44d-4d7f-b301-6c7069bf6096\") " pod="openshift-marketplace/marketplace-operator-79b997595-p2hrr" Oct 03 14:47:55 crc kubenswrapper[4774]: I1003 14:47:55.895502 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rnrb\" (UniqueName: \"kubernetes.io/projected/3fd11a50-e44d-4d7f-b301-6c7069bf6096-kube-api-access-7rnrb\") pod \"marketplace-operator-79b997595-p2hrr\" (UID: \"3fd11a50-e44d-4d7f-b301-6c7069bf6096\") " pod="openshift-marketplace/marketplace-operator-79b997595-p2hrr" Oct 03 14:47:55 crc kubenswrapper[4774]: I1003 14:47:55.895555 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/3fd11a50-e44d-4d7f-b301-6c7069bf6096-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p2hrr\" (UID: \"3fd11a50-e44d-4d7f-b301-6c7069bf6096\") " pod="openshift-marketplace/marketplace-operator-79b997595-p2hrr" Oct 03 14:47:55 crc kubenswrapper[4774]: I1003 14:47:55.996135 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3fd11a50-e44d-4d7f-b301-6c7069bf6096-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p2hrr\" (UID: \"3fd11a50-e44d-4d7f-b301-6c7069bf6096\") " pod="openshift-marketplace/marketplace-operator-79b997595-p2hrr" Oct 03 14:47:55 crc kubenswrapper[4774]: I1003 14:47:55.996188 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3fd11a50-e44d-4d7f-b301-6c7069bf6096-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p2hrr\" (UID: \"3fd11a50-e44d-4d7f-b301-6c7069bf6096\") " pod="openshift-marketplace/marketplace-operator-79b997595-p2hrr" Oct 03 14:47:55 crc kubenswrapper[4774]: I1003 14:47:55.996218 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rnrb\" (UniqueName: \"kubernetes.io/projected/3fd11a50-e44d-4d7f-b301-6c7069bf6096-kube-api-access-7rnrb\") pod \"marketplace-operator-79b997595-p2hrr\" (UID: \"3fd11a50-e44d-4d7f-b301-6c7069bf6096\") " pod="openshift-marketplace/marketplace-operator-79b997595-p2hrr" Oct 03 14:47:55 crc kubenswrapper[4774]: I1003 14:47:55.998569 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3fd11a50-e44d-4d7f-b301-6c7069bf6096-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p2hrr\" (UID: \"3fd11a50-e44d-4d7f-b301-6c7069bf6096\") " pod="openshift-marketplace/marketplace-operator-79b997595-p2hrr" Oct 03 14:47:56 crc 
kubenswrapper[4774]: I1003 14:47:56.006213 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3fd11a50-e44d-4d7f-b301-6c7069bf6096-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p2hrr\" (UID: \"3fd11a50-e44d-4d7f-b301-6c7069bf6096\") " pod="openshift-marketplace/marketplace-operator-79b997595-p2hrr" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.014628 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rnrb\" (UniqueName: \"kubernetes.io/projected/3fd11a50-e44d-4d7f-b301-6c7069bf6096-kube-api-access-7rnrb\") pod \"marketplace-operator-79b997595-p2hrr\" (UID: \"3fd11a50-e44d-4d7f-b301-6c7069bf6096\") " pod="openshift-marketplace/marketplace-operator-79b997595-p2hrr" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.216888 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p2hrr" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.223379 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2bx5m" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.233062 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wq2sd" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.234978 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lvvp2" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.281093 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mg7ls" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.306229 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hntbw" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.408076 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4a6d0e1-38db-42e1-8e29-b52de5bbad1d-utilities\") pod \"e4a6d0e1-38db-42e1-8e29-b52de5bbad1d\" (UID: \"e4a6d0e1-38db-42e1-8e29-b52de5bbad1d\") " Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.408334 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwp6h\" (UniqueName: \"kubernetes.io/projected/2aad74fd-c975-42c4-8b05-ed28dbd55205-kube-api-access-cwp6h\") pod \"2aad74fd-c975-42c4-8b05-ed28dbd55205\" (UID: \"2aad74fd-c975-42c4-8b05-ed28dbd55205\") " Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.408354 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4a6d0e1-38db-42e1-8e29-b52de5bbad1d-catalog-content\") pod \"e4a6d0e1-38db-42e1-8e29-b52de5bbad1d\" (UID: \"e4a6d0e1-38db-42e1-8e29-b52de5bbad1d\") " Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.408388 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/923b3b0f-6810-4504-9757-fe4761f6ed37-marketplace-trusted-ca\") pod \"923b3b0f-6810-4504-9757-fe4761f6ed37\" (UID: \"923b3b0f-6810-4504-9757-fe4761f6ed37\") " Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.408420 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aad74fd-c975-42c4-8b05-ed28dbd55205-utilities\") pod \"2aad74fd-c975-42c4-8b05-ed28dbd55205\" (UID: \"2aad74fd-c975-42c4-8b05-ed28dbd55205\") " Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.408436 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/062191e1-9f34-4dba-bd1c-9bffe53f5cfd-catalog-content\") pod \"062191e1-9f34-4dba-bd1c-9bffe53f5cfd\" (UID: \"062191e1-9f34-4dba-bd1c-9bffe53f5cfd\") " Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.408459 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7rtc\" (UniqueName: \"kubernetes.io/projected/923b3b0f-6810-4504-9757-fe4761f6ed37-kube-api-access-d7rtc\") pod \"923b3b0f-6810-4504-9757-fe4761f6ed37\" (UID: \"923b3b0f-6810-4504-9757-fe4761f6ed37\") " Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.408481 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/062191e1-9f34-4dba-bd1c-9bffe53f5cfd-utilities\") pod \"062191e1-9f34-4dba-bd1c-9bffe53f5cfd\" (UID: \"062191e1-9f34-4dba-bd1c-9bffe53f5cfd\") " Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.408505 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/923b3b0f-6810-4504-9757-fe4761f6ed37-marketplace-operator-metrics\") pod \"923b3b0f-6810-4504-9757-fe4761f6ed37\" (UID: \"923b3b0f-6810-4504-9757-fe4761f6ed37\") " Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.408526 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dswf2\" (UniqueName: \"kubernetes.io/projected/e4a6d0e1-38db-42e1-8e29-b52de5bbad1d-kube-api-access-dswf2\") pod \"e4a6d0e1-38db-42e1-8e29-b52de5bbad1d\" (UID: \"e4a6d0e1-38db-42e1-8e29-b52de5bbad1d\") " Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.408541 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8922cd85-deea-4326-88fc-68c67debf56c-catalog-content\") pod \"8922cd85-deea-4326-88fc-68c67debf56c\" (UID: 
\"8922cd85-deea-4326-88fc-68c67debf56c\") " Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.408558 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9j9pm\" (UniqueName: \"kubernetes.io/projected/8922cd85-deea-4326-88fc-68c67debf56c-kube-api-access-9j9pm\") pod \"8922cd85-deea-4326-88fc-68c67debf56c\" (UID: \"8922cd85-deea-4326-88fc-68c67debf56c\") " Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.408581 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8922cd85-deea-4326-88fc-68c67debf56c-utilities\") pod \"8922cd85-deea-4326-88fc-68c67debf56c\" (UID: \"8922cd85-deea-4326-88fc-68c67debf56c\") " Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.408604 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aad74fd-c975-42c4-8b05-ed28dbd55205-catalog-content\") pod \"2aad74fd-c975-42c4-8b05-ed28dbd55205\" (UID: \"2aad74fd-c975-42c4-8b05-ed28dbd55205\") " Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.408623 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndlls\" (UniqueName: \"kubernetes.io/projected/062191e1-9f34-4dba-bd1c-9bffe53f5cfd-kube-api-access-ndlls\") pod \"062191e1-9f34-4dba-bd1c-9bffe53f5cfd\" (UID: \"062191e1-9f34-4dba-bd1c-9bffe53f5cfd\") " Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.409151 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4a6d0e1-38db-42e1-8e29-b52de5bbad1d-utilities" (OuterVolumeSpecName: "utilities") pod "e4a6d0e1-38db-42e1-8e29-b52de5bbad1d" (UID: "e4a6d0e1-38db-42e1-8e29-b52de5bbad1d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.409505 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/062191e1-9f34-4dba-bd1c-9bffe53f5cfd-utilities" (OuterVolumeSpecName: "utilities") pod "062191e1-9f34-4dba-bd1c-9bffe53f5cfd" (UID: "062191e1-9f34-4dba-bd1c-9bffe53f5cfd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.409777 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8922cd85-deea-4326-88fc-68c67debf56c-utilities" (OuterVolumeSpecName: "utilities") pod "8922cd85-deea-4326-88fc-68c67debf56c" (UID: "8922cd85-deea-4326-88fc-68c67debf56c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.410347 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2aad74fd-c975-42c4-8b05-ed28dbd55205-utilities" (OuterVolumeSpecName: "utilities") pod "2aad74fd-c975-42c4-8b05-ed28dbd55205" (UID: "2aad74fd-c975-42c4-8b05-ed28dbd55205"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.410385 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/923b3b0f-6810-4504-9757-fe4761f6ed37-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "923b3b0f-6810-4504-9757-fe4761f6ed37" (UID: "923b3b0f-6810-4504-9757-fe4761f6ed37"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.412690 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4a6d0e1-38db-42e1-8e29-b52de5bbad1d-kube-api-access-dswf2" (OuterVolumeSpecName: "kube-api-access-dswf2") pod "e4a6d0e1-38db-42e1-8e29-b52de5bbad1d" (UID: "e4a6d0e1-38db-42e1-8e29-b52de5bbad1d"). InnerVolumeSpecName "kube-api-access-dswf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.412821 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/923b3b0f-6810-4504-9757-fe4761f6ed37-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "923b3b0f-6810-4504-9757-fe4761f6ed37" (UID: "923b3b0f-6810-4504-9757-fe4761f6ed37"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.412867 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/923b3b0f-6810-4504-9757-fe4761f6ed37-kube-api-access-d7rtc" (OuterVolumeSpecName: "kube-api-access-d7rtc") pod "923b3b0f-6810-4504-9757-fe4761f6ed37" (UID: "923b3b0f-6810-4504-9757-fe4761f6ed37"). InnerVolumeSpecName "kube-api-access-d7rtc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.413551 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8922cd85-deea-4326-88fc-68c67debf56c-kube-api-access-9j9pm" (OuterVolumeSpecName: "kube-api-access-9j9pm") pod "8922cd85-deea-4326-88fc-68c67debf56c" (UID: "8922cd85-deea-4326-88fc-68c67debf56c"). InnerVolumeSpecName "kube-api-access-9j9pm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.414120 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aad74fd-c975-42c4-8b05-ed28dbd55205-kube-api-access-cwp6h" (OuterVolumeSpecName: "kube-api-access-cwp6h") pod "2aad74fd-c975-42c4-8b05-ed28dbd55205" (UID: "2aad74fd-c975-42c4-8b05-ed28dbd55205"). InnerVolumeSpecName "kube-api-access-cwp6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.415311 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/062191e1-9f34-4dba-bd1c-9bffe53f5cfd-kube-api-access-ndlls" (OuterVolumeSpecName: "kube-api-access-ndlls") pod "062191e1-9f34-4dba-bd1c-9bffe53f5cfd" (UID: "062191e1-9f34-4dba-bd1c-9bffe53f5cfd"). InnerVolumeSpecName "kube-api-access-ndlls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.430421 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8922cd85-deea-4326-88fc-68c67debf56c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8922cd85-deea-4326-88fc-68c67debf56c" (UID: "8922cd85-deea-4326-88fc-68c67debf56c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.458200 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4a6d0e1-38db-42e1-8e29-b52de5bbad1d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e4a6d0e1-38db-42e1-8e29-b52de5bbad1d" (UID: "e4a6d0e1-38db-42e1-8e29-b52de5bbad1d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.476713 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p2hrr"] Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.479206 4774 generic.go:334] "Generic (PLEG): container finished" podID="923b3b0f-6810-4504-9757-fe4761f6ed37" containerID="c733ea4a70192141edf13896a9e05ea86a2e061234cf9e5126aa9ee056a811a5" exitCode=0 Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.479350 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wq2sd" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.480998 4774 generic.go:334] "Generic (PLEG): container finished" podID="062191e1-9f34-4dba-bd1c-9bffe53f5cfd" containerID="1cc64d06fa501089dae4c94b6c5806c657e04d76020fda78779165c881d12b52" exitCode=0 Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.481126 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2bx5m" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.481465 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wq2sd" event={"ID":"923b3b0f-6810-4504-9757-fe4761f6ed37","Type":"ContainerDied","Data":"c733ea4a70192141edf13896a9e05ea86a2e061234cf9e5126aa9ee056a811a5"} Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.481536 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wq2sd" event={"ID":"923b3b0f-6810-4504-9757-fe4761f6ed37","Type":"ContainerDied","Data":"4f4e8f93e0f851ea4d6a2cac331d092993d988b18c5dd9d61d5ed1d32d8c1a14"} Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.481552 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2bx5m" event={"ID":"062191e1-9f34-4dba-bd1c-9bffe53f5cfd","Type":"ContainerDied","Data":"1cc64d06fa501089dae4c94b6c5806c657e04d76020fda78779165c881d12b52"} Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.481566 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2bx5m" event={"ID":"062191e1-9f34-4dba-bd1c-9bffe53f5cfd","Type":"ContainerDied","Data":"97220c6cc1c72285ae50a28f00349fdf7be9db99f94c663a054f070dfdc4ed87"} Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.481610 4774 scope.go:117] "RemoveContainer" containerID="c733ea4a70192141edf13896a9e05ea86a2e061234cf9e5126aa9ee056a811a5" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.485046 4774 generic.go:334] "Generic (PLEG): container finished" podID="8922cd85-deea-4326-88fc-68c67debf56c" containerID="6b79066e99b1763fd752878b0f5e706ee85a60e59e05995ea2b0b37760923ae5" exitCode=0 Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.485238 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hntbw" 
event={"ID":"8922cd85-deea-4326-88fc-68c67debf56c","Type":"ContainerDied","Data":"6b79066e99b1763fd752878b0f5e706ee85a60e59e05995ea2b0b37760923ae5"} Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.485293 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hntbw" event={"ID":"8922cd85-deea-4326-88fc-68c67debf56c","Type":"ContainerDied","Data":"0725ef5ae026f1a72a3f32f252c278772d86625ece750bcd771ec927412cff13"} Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.485713 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hntbw" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.490912 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/062191e1-9f34-4dba-bd1c-9bffe53f5cfd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "062191e1-9f34-4dba-bd1c-9bffe53f5cfd" (UID: "062191e1-9f34-4dba-bd1c-9bffe53f5cfd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.494453 4774 generic.go:334] "Generic (PLEG): container finished" podID="e4a6d0e1-38db-42e1-8e29-b52de5bbad1d" containerID="10d22322ea6701471ae96fa67822eef255ecd138cab3a81c70d5dfdce2e58151" exitCode=0 Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.494510 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lvvp2" event={"ID":"e4a6d0e1-38db-42e1-8e29-b52de5bbad1d","Type":"ContainerDied","Data":"10d22322ea6701471ae96fa67822eef255ecd138cab3a81c70d5dfdce2e58151"} Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.494533 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lvvp2" event={"ID":"e4a6d0e1-38db-42e1-8e29-b52de5bbad1d","Type":"ContainerDied","Data":"021f06d1ec720728ea848cd7fb3d89d12fd0ebdb382e0340fcf99bab1045d1d3"} Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.494534 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lvvp2" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.497815 4774 generic.go:334] "Generic (PLEG): container finished" podID="2aad74fd-c975-42c4-8b05-ed28dbd55205" containerID="19ce9cd779ae623063419f702b4eac661c79d2dc10203bfa62c52c8e2683272f" exitCode=0 Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.497837 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mg7ls" event={"ID":"2aad74fd-c975-42c4-8b05-ed28dbd55205","Type":"ContainerDied","Data":"19ce9cd779ae623063419f702b4eac661c79d2dc10203bfa62c52c8e2683272f"} Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.497854 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mg7ls" event={"ID":"2aad74fd-c975-42c4-8b05-ed28dbd55205","Type":"ContainerDied","Data":"3a1f1daa166150bf27f3602be244016c227bc44d51c375a5ba04c569a732e701"} Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.497890 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mg7ls" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.514701 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4a6d0e1-38db-42e1-8e29-b52de5bbad1d-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.514737 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwp6h\" (UniqueName: \"kubernetes.io/projected/2aad74fd-c975-42c4-8b05-ed28dbd55205-kube-api-access-cwp6h\") on node \"crc\" DevicePath \"\"" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.514747 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4a6d0e1-38db-42e1-8e29-b52de5bbad1d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.514756 4774 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/923b3b0f-6810-4504-9757-fe4761f6ed37-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.514764 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aad74fd-c975-42c4-8b05-ed28dbd55205-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.514774 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/062191e1-9f34-4dba-bd1c-9bffe53f5cfd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.514789 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7rtc\" (UniqueName: \"kubernetes.io/projected/923b3b0f-6810-4504-9757-fe4761f6ed37-kube-api-access-d7rtc\") on node \"crc\" DevicePath \"\"" Oct 03 14:47:56 crc 
kubenswrapper[4774]: I1003 14:47:56.514803 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/062191e1-9f34-4dba-bd1c-9bffe53f5cfd-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.514816 4774 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/923b3b0f-6810-4504-9757-fe4761f6ed37-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.514829 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8922cd85-deea-4326-88fc-68c67debf56c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.514838 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dswf2\" (UniqueName: \"kubernetes.io/projected/e4a6d0e1-38db-42e1-8e29-b52de5bbad1d-kube-api-access-dswf2\") on node \"crc\" DevicePath \"\"" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.514851 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9j9pm\" (UniqueName: \"kubernetes.io/projected/8922cd85-deea-4326-88fc-68c67debf56c-kube-api-access-9j9pm\") on node \"crc\" DevicePath \"\"" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.514868 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8922cd85-deea-4326-88fc-68c67debf56c-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.514880 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndlls\" (UniqueName: \"kubernetes.io/projected/062191e1-9f34-4dba-bd1c-9bffe53f5cfd-kube-api-access-ndlls\") on node \"crc\" DevicePath \"\"" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.515468 4774 scope.go:117] 
"RemoveContainer" containerID="c733ea4a70192141edf13896a9e05ea86a2e061234cf9e5126aa9ee056a811a5" Oct 03 14:47:56 crc kubenswrapper[4774]: E1003 14:47:56.515931 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c733ea4a70192141edf13896a9e05ea86a2e061234cf9e5126aa9ee056a811a5\": container with ID starting with c733ea4a70192141edf13896a9e05ea86a2e061234cf9e5126aa9ee056a811a5 not found: ID does not exist" containerID="c733ea4a70192141edf13896a9e05ea86a2e061234cf9e5126aa9ee056a811a5" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.515966 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c733ea4a70192141edf13896a9e05ea86a2e061234cf9e5126aa9ee056a811a5"} err="failed to get container status \"c733ea4a70192141edf13896a9e05ea86a2e061234cf9e5126aa9ee056a811a5\": rpc error: code = NotFound desc = could not find container \"c733ea4a70192141edf13896a9e05ea86a2e061234cf9e5126aa9ee056a811a5\": container with ID starting with c733ea4a70192141edf13896a9e05ea86a2e061234cf9e5126aa9ee056a811a5 not found: ID does not exist" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.515992 4774 scope.go:117] "RemoveContainer" containerID="1cc64d06fa501089dae4c94b6c5806c657e04d76020fda78779165c881d12b52" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.519311 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wq2sd"] Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.522209 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wq2sd"] Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.532973 4774 scope.go:117] "RemoveContainer" containerID="6e6f5e7b98a3f52922cb4a4e351aae146118381748560093cc1ff42558d528b3" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.541116 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-lvvp2"] Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.544928 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lvvp2"] Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.547018 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2aad74fd-c975-42c4-8b05-ed28dbd55205-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2aad74fd-c975-42c4-8b05-ed28dbd55205" (UID: "2aad74fd-c975-42c4-8b05-ed28dbd55205"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.552123 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hntbw"] Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.554989 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hntbw"] Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.557552 4774 scope.go:117] "RemoveContainer" containerID="0d45cb133189514f14e10e01b4d32fa37359727eb1ea83fa484e8e7e9ba3a9aa" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.571172 4774 scope.go:117] "RemoveContainer" containerID="1cc64d06fa501089dae4c94b6c5806c657e04d76020fda78779165c881d12b52" Oct 03 14:47:56 crc kubenswrapper[4774]: E1003 14:47:56.574502 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cc64d06fa501089dae4c94b6c5806c657e04d76020fda78779165c881d12b52\": container with ID starting with 1cc64d06fa501089dae4c94b6c5806c657e04d76020fda78779165c881d12b52 not found: ID does not exist" containerID="1cc64d06fa501089dae4c94b6c5806c657e04d76020fda78779165c881d12b52" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.574535 4774 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1cc64d06fa501089dae4c94b6c5806c657e04d76020fda78779165c881d12b52"} err="failed to get container status \"1cc64d06fa501089dae4c94b6c5806c657e04d76020fda78779165c881d12b52\": rpc error: code = NotFound desc = could not find container \"1cc64d06fa501089dae4c94b6c5806c657e04d76020fda78779165c881d12b52\": container with ID starting with 1cc64d06fa501089dae4c94b6c5806c657e04d76020fda78779165c881d12b52 not found: ID does not exist" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.574558 4774 scope.go:117] "RemoveContainer" containerID="6e6f5e7b98a3f52922cb4a4e351aae146118381748560093cc1ff42558d528b3" Oct 03 14:47:56 crc kubenswrapper[4774]: E1003 14:47:56.574856 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e6f5e7b98a3f52922cb4a4e351aae146118381748560093cc1ff42558d528b3\": container with ID starting with 6e6f5e7b98a3f52922cb4a4e351aae146118381748560093cc1ff42558d528b3 not found: ID does not exist" containerID="6e6f5e7b98a3f52922cb4a4e351aae146118381748560093cc1ff42558d528b3" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.574876 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e6f5e7b98a3f52922cb4a4e351aae146118381748560093cc1ff42558d528b3"} err="failed to get container status \"6e6f5e7b98a3f52922cb4a4e351aae146118381748560093cc1ff42558d528b3\": rpc error: code = NotFound desc = could not find container \"6e6f5e7b98a3f52922cb4a4e351aae146118381748560093cc1ff42558d528b3\": container with ID starting with 6e6f5e7b98a3f52922cb4a4e351aae146118381748560093cc1ff42558d528b3 not found: ID does not exist" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.574890 4774 scope.go:117] "RemoveContainer" containerID="0d45cb133189514f14e10e01b4d32fa37359727eb1ea83fa484e8e7e9ba3a9aa" Oct 03 14:47:56 crc kubenswrapper[4774]: E1003 14:47:56.575234 4774 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"0d45cb133189514f14e10e01b4d32fa37359727eb1ea83fa484e8e7e9ba3a9aa\": container with ID starting with 0d45cb133189514f14e10e01b4d32fa37359727eb1ea83fa484e8e7e9ba3a9aa not found: ID does not exist" containerID="0d45cb133189514f14e10e01b4d32fa37359727eb1ea83fa484e8e7e9ba3a9aa" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.575265 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d45cb133189514f14e10e01b4d32fa37359727eb1ea83fa484e8e7e9ba3a9aa"} err="failed to get container status \"0d45cb133189514f14e10e01b4d32fa37359727eb1ea83fa484e8e7e9ba3a9aa\": rpc error: code = NotFound desc = could not find container \"0d45cb133189514f14e10e01b4d32fa37359727eb1ea83fa484e8e7e9ba3a9aa\": container with ID starting with 0d45cb133189514f14e10e01b4d32fa37359727eb1ea83fa484e8e7e9ba3a9aa not found: ID does not exist" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.575287 4774 scope.go:117] "RemoveContainer" containerID="6b79066e99b1763fd752878b0f5e706ee85a60e59e05995ea2b0b37760923ae5" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.588093 4774 scope.go:117] "RemoveContainer" containerID="0d64e788760602d93c85ee1d770d03e0f304106d4749502c1b9a6e5a1936dfd6" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.599517 4774 scope.go:117] "RemoveContainer" containerID="aef3313280418fbfc839cabcaf85c5e7bf37d120e395b981b70393f30dc222ba" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.611904 4774 scope.go:117] "RemoveContainer" containerID="6b79066e99b1763fd752878b0f5e706ee85a60e59e05995ea2b0b37760923ae5" Oct 03 14:47:56 crc kubenswrapper[4774]: E1003 14:47:56.612327 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b79066e99b1763fd752878b0f5e706ee85a60e59e05995ea2b0b37760923ae5\": container with ID starting with 
6b79066e99b1763fd752878b0f5e706ee85a60e59e05995ea2b0b37760923ae5 not found: ID does not exist" containerID="6b79066e99b1763fd752878b0f5e706ee85a60e59e05995ea2b0b37760923ae5" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.612430 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b79066e99b1763fd752878b0f5e706ee85a60e59e05995ea2b0b37760923ae5"} err="failed to get container status \"6b79066e99b1763fd752878b0f5e706ee85a60e59e05995ea2b0b37760923ae5\": rpc error: code = NotFound desc = could not find container \"6b79066e99b1763fd752878b0f5e706ee85a60e59e05995ea2b0b37760923ae5\": container with ID starting with 6b79066e99b1763fd752878b0f5e706ee85a60e59e05995ea2b0b37760923ae5 not found: ID does not exist" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.612485 4774 scope.go:117] "RemoveContainer" containerID="0d64e788760602d93c85ee1d770d03e0f304106d4749502c1b9a6e5a1936dfd6" Oct 03 14:47:56 crc kubenswrapper[4774]: E1003 14:47:56.612817 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d64e788760602d93c85ee1d770d03e0f304106d4749502c1b9a6e5a1936dfd6\": container with ID starting with 0d64e788760602d93c85ee1d770d03e0f304106d4749502c1b9a6e5a1936dfd6 not found: ID does not exist" containerID="0d64e788760602d93c85ee1d770d03e0f304106d4749502c1b9a6e5a1936dfd6" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.612847 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d64e788760602d93c85ee1d770d03e0f304106d4749502c1b9a6e5a1936dfd6"} err="failed to get container status \"0d64e788760602d93c85ee1d770d03e0f304106d4749502c1b9a6e5a1936dfd6\": rpc error: code = NotFound desc = could not find container \"0d64e788760602d93c85ee1d770d03e0f304106d4749502c1b9a6e5a1936dfd6\": container with ID starting with 0d64e788760602d93c85ee1d770d03e0f304106d4749502c1b9a6e5a1936dfd6 not found: ID does not 
exist" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.612866 4774 scope.go:117] "RemoveContainer" containerID="aef3313280418fbfc839cabcaf85c5e7bf37d120e395b981b70393f30dc222ba" Oct 03 14:47:56 crc kubenswrapper[4774]: E1003 14:47:56.613279 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aef3313280418fbfc839cabcaf85c5e7bf37d120e395b981b70393f30dc222ba\": container with ID starting with aef3313280418fbfc839cabcaf85c5e7bf37d120e395b981b70393f30dc222ba not found: ID does not exist" containerID="aef3313280418fbfc839cabcaf85c5e7bf37d120e395b981b70393f30dc222ba" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.613333 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aef3313280418fbfc839cabcaf85c5e7bf37d120e395b981b70393f30dc222ba"} err="failed to get container status \"aef3313280418fbfc839cabcaf85c5e7bf37d120e395b981b70393f30dc222ba\": rpc error: code = NotFound desc = could not find container \"aef3313280418fbfc839cabcaf85c5e7bf37d120e395b981b70393f30dc222ba\": container with ID starting with aef3313280418fbfc839cabcaf85c5e7bf37d120e395b981b70393f30dc222ba not found: ID does not exist" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.613351 4774 scope.go:117] "RemoveContainer" containerID="10d22322ea6701471ae96fa67822eef255ecd138cab3a81c70d5dfdce2e58151" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.615588 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aad74fd-c975-42c4-8b05-ed28dbd55205-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.627230 4774 scope.go:117] "RemoveContainer" containerID="0dce622d016960e7f0e1296173f249ef8cc2d5599efc47aae75a04f9f80fe587" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.643890 4774 scope.go:117] "RemoveContainer" 
containerID="0a540d82efa8c4b60873f06a9e31ed8a7fd13e0d204f34e06e5f399194d9c554" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.657144 4774 scope.go:117] "RemoveContainer" containerID="10d22322ea6701471ae96fa67822eef255ecd138cab3a81c70d5dfdce2e58151" Oct 03 14:47:56 crc kubenswrapper[4774]: E1003 14:47:56.658728 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10d22322ea6701471ae96fa67822eef255ecd138cab3a81c70d5dfdce2e58151\": container with ID starting with 10d22322ea6701471ae96fa67822eef255ecd138cab3a81c70d5dfdce2e58151 not found: ID does not exist" containerID="10d22322ea6701471ae96fa67822eef255ecd138cab3a81c70d5dfdce2e58151" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.658767 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10d22322ea6701471ae96fa67822eef255ecd138cab3a81c70d5dfdce2e58151"} err="failed to get container status \"10d22322ea6701471ae96fa67822eef255ecd138cab3a81c70d5dfdce2e58151\": rpc error: code = NotFound desc = could not find container \"10d22322ea6701471ae96fa67822eef255ecd138cab3a81c70d5dfdce2e58151\": container with ID starting with 10d22322ea6701471ae96fa67822eef255ecd138cab3a81c70d5dfdce2e58151 not found: ID does not exist" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.658795 4774 scope.go:117] "RemoveContainer" containerID="0dce622d016960e7f0e1296173f249ef8cc2d5599efc47aae75a04f9f80fe587" Oct 03 14:47:56 crc kubenswrapper[4774]: E1003 14:47:56.659107 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dce622d016960e7f0e1296173f249ef8cc2d5599efc47aae75a04f9f80fe587\": container with ID starting with 0dce622d016960e7f0e1296173f249ef8cc2d5599efc47aae75a04f9f80fe587 not found: ID does not exist" containerID="0dce622d016960e7f0e1296173f249ef8cc2d5599efc47aae75a04f9f80fe587" Oct 03 14:47:56 crc 
kubenswrapper[4774]: I1003 14:47:56.659165 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dce622d016960e7f0e1296173f249ef8cc2d5599efc47aae75a04f9f80fe587"} err="failed to get container status \"0dce622d016960e7f0e1296173f249ef8cc2d5599efc47aae75a04f9f80fe587\": rpc error: code = NotFound desc = could not find container \"0dce622d016960e7f0e1296173f249ef8cc2d5599efc47aae75a04f9f80fe587\": container with ID starting with 0dce622d016960e7f0e1296173f249ef8cc2d5599efc47aae75a04f9f80fe587 not found: ID does not exist" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.659199 4774 scope.go:117] "RemoveContainer" containerID="0a540d82efa8c4b60873f06a9e31ed8a7fd13e0d204f34e06e5f399194d9c554" Oct 03 14:47:56 crc kubenswrapper[4774]: E1003 14:47:56.659511 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a540d82efa8c4b60873f06a9e31ed8a7fd13e0d204f34e06e5f399194d9c554\": container with ID starting with 0a540d82efa8c4b60873f06a9e31ed8a7fd13e0d204f34e06e5f399194d9c554 not found: ID does not exist" containerID="0a540d82efa8c4b60873f06a9e31ed8a7fd13e0d204f34e06e5f399194d9c554" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.659541 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a540d82efa8c4b60873f06a9e31ed8a7fd13e0d204f34e06e5f399194d9c554"} err="failed to get container status \"0a540d82efa8c4b60873f06a9e31ed8a7fd13e0d204f34e06e5f399194d9c554\": rpc error: code = NotFound desc = could not find container \"0a540d82efa8c4b60873f06a9e31ed8a7fd13e0d204f34e06e5f399194d9c554\": container with ID starting with 0a540d82efa8c4b60873f06a9e31ed8a7fd13e0d204f34e06e5f399194d9c554 not found: ID does not exist" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.659555 4774 scope.go:117] "RemoveContainer" containerID="19ce9cd779ae623063419f702b4eac661c79d2dc10203bfa62c52c8e2683272f" Oct 03 
14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.671801 4774 scope.go:117] "RemoveContainer" containerID="b223d514ad97cb3e3ca217b60e774a83b4505e236401d379396d7f9cbe15237c" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.683540 4774 scope.go:117] "RemoveContainer" containerID="241e7b66b12fb0ab7a133a4e3394b1e88df7231028e415a59af033fe842d0eae" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.696310 4774 scope.go:117] "RemoveContainer" containerID="19ce9cd779ae623063419f702b4eac661c79d2dc10203bfa62c52c8e2683272f" Oct 03 14:47:56 crc kubenswrapper[4774]: E1003 14:47:56.696832 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19ce9cd779ae623063419f702b4eac661c79d2dc10203bfa62c52c8e2683272f\": container with ID starting with 19ce9cd779ae623063419f702b4eac661c79d2dc10203bfa62c52c8e2683272f not found: ID does not exist" containerID="19ce9cd779ae623063419f702b4eac661c79d2dc10203bfa62c52c8e2683272f" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.696875 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19ce9cd779ae623063419f702b4eac661c79d2dc10203bfa62c52c8e2683272f"} err="failed to get container status \"19ce9cd779ae623063419f702b4eac661c79d2dc10203bfa62c52c8e2683272f\": rpc error: code = NotFound desc = could not find container \"19ce9cd779ae623063419f702b4eac661c79d2dc10203bfa62c52c8e2683272f\": container with ID starting with 19ce9cd779ae623063419f702b4eac661c79d2dc10203bfa62c52c8e2683272f not found: ID does not exist" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.696902 4774 scope.go:117] "RemoveContainer" containerID="b223d514ad97cb3e3ca217b60e774a83b4505e236401d379396d7f9cbe15237c" Oct 03 14:47:56 crc kubenswrapper[4774]: E1003 14:47:56.697232 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b223d514ad97cb3e3ca217b60e774a83b4505e236401d379396d7f9cbe15237c\": container with ID starting with b223d514ad97cb3e3ca217b60e774a83b4505e236401d379396d7f9cbe15237c not found: ID does not exist" containerID="b223d514ad97cb3e3ca217b60e774a83b4505e236401d379396d7f9cbe15237c" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.697269 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b223d514ad97cb3e3ca217b60e774a83b4505e236401d379396d7f9cbe15237c"} err="failed to get container status \"b223d514ad97cb3e3ca217b60e774a83b4505e236401d379396d7f9cbe15237c\": rpc error: code = NotFound desc = could not find container \"b223d514ad97cb3e3ca217b60e774a83b4505e236401d379396d7f9cbe15237c\": container with ID starting with b223d514ad97cb3e3ca217b60e774a83b4505e236401d379396d7f9cbe15237c not found: ID does not exist" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.697297 4774 scope.go:117] "RemoveContainer" containerID="241e7b66b12fb0ab7a133a4e3394b1e88df7231028e415a59af033fe842d0eae" Oct 03 14:47:56 crc kubenswrapper[4774]: E1003 14:47:56.697746 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"241e7b66b12fb0ab7a133a4e3394b1e88df7231028e415a59af033fe842d0eae\": container with ID starting with 241e7b66b12fb0ab7a133a4e3394b1e88df7231028e415a59af033fe842d0eae not found: ID does not exist" containerID="241e7b66b12fb0ab7a133a4e3394b1e88df7231028e415a59af033fe842d0eae" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.697771 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"241e7b66b12fb0ab7a133a4e3394b1e88df7231028e415a59af033fe842d0eae"} err="failed to get container status \"241e7b66b12fb0ab7a133a4e3394b1e88df7231028e415a59af033fe842d0eae\": rpc error: code = NotFound desc = could not find container \"241e7b66b12fb0ab7a133a4e3394b1e88df7231028e415a59af033fe842d0eae\": container with ID 
starting with 241e7b66b12fb0ab7a133a4e3394b1e88df7231028e415a59af033fe842d0eae not found: ID does not exist" Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.809896 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2bx5m"] Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.816681 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2bx5m"] Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.832198 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mg7ls"] Oct 03 14:47:56 crc kubenswrapper[4774]: I1003 14:47:56.835781 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mg7ls"] Oct 03 14:47:57 crc kubenswrapper[4774]: I1003 14:47:57.309660 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="062191e1-9f34-4dba-bd1c-9bffe53f5cfd" path="/var/lib/kubelet/pods/062191e1-9f34-4dba-bd1c-9bffe53f5cfd/volumes" Oct 03 14:47:57 crc kubenswrapper[4774]: I1003 14:47:57.311511 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2aad74fd-c975-42c4-8b05-ed28dbd55205" path="/var/lib/kubelet/pods/2aad74fd-c975-42c4-8b05-ed28dbd55205/volumes" Oct 03 14:47:57 crc kubenswrapper[4774]: I1003 14:47:57.312822 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8922cd85-deea-4326-88fc-68c67debf56c" path="/var/lib/kubelet/pods/8922cd85-deea-4326-88fc-68c67debf56c/volumes" Oct 03 14:47:57 crc kubenswrapper[4774]: I1003 14:47:57.314046 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="923b3b0f-6810-4504-9757-fe4761f6ed37" path="/var/lib/kubelet/pods/923b3b0f-6810-4504-9757-fe4761f6ed37/volumes" Oct 03 14:47:57 crc kubenswrapper[4774]: I1003 14:47:57.314516 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4a6d0e1-38db-42e1-8e29-b52de5bbad1d" 
path="/var/lib/kubelet/pods/e4a6d0e1-38db-42e1-8e29-b52de5bbad1d/volumes" Oct 03 14:47:57 crc kubenswrapper[4774]: I1003 14:47:57.508893 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p2hrr" event={"ID":"3fd11a50-e44d-4d7f-b301-6c7069bf6096","Type":"ContainerStarted","Data":"c35968d8efc495bc41a2034d0cf6c154c6319d833a184937e1b0fb916e3b6f88"} Oct 03 14:47:57 crc kubenswrapper[4774]: I1003 14:47:57.508946 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p2hrr" event={"ID":"3fd11a50-e44d-4d7f-b301-6c7069bf6096","Type":"ContainerStarted","Data":"ceadd83edaa3ec8f0da4edc27881c5b97402be9ba6bf101e2e461f6ab059e3df"} Oct 03 14:47:57 crc kubenswrapper[4774]: I1003 14:47:57.509160 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-p2hrr" Oct 03 14:47:57 crc kubenswrapper[4774]: I1003 14:47:57.515093 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-p2hrr" Oct 03 14:47:57 crc kubenswrapper[4774]: I1003 14:47:57.539325 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-p2hrr" podStartSLOduration=2.539304386 podStartE2EDuration="2.539304386s" podCreationTimestamp="2025-10-03 14:47:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:47:57.531165495 +0000 UTC m=+300.120368957" watchObservedRunningTime="2025-10-03 14:47:57.539304386 +0000 UTC m=+300.128507838" Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.005264 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m7x75"] Oct 03 14:47:58 crc kubenswrapper[4774]: E1003 14:47:58.005497 4774 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="2aad74fd-c975-42c4-8b05-ed28dbd55205" containerName="extract-utilities" Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.005512 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aad74fd-c975-42c4-8b05-ed28dbd55205" containerName="extract-utilities" Oct 03 14:47:58 crc kubenswrapper[4774]: E1003 14:47:58.005522 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aad74fd-c975-42c4-8b05-ed28dbd55205" containerName="extract-content" Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.005530 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aad74fd-c975-42c4-8b05-ed28dbd55205" containerName="extract-content" Oct 03 14:47:58 crc kubenswrapper[4774]: E1003 14:47:58.005543 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="062191e1-9f34-4dba-bd1c-9bffe53f5cfd" containerName="extract-content" Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.005552 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="062191e1-9f34-4dba-bd1c-9bffe53f5cfd" containerName="extract-content" Oct 03 14:47:58 crc kubenswrapper[4774]: E1003 14:47:58.005565 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aad74fd-c975-42c4-8b05-ed28dbd55205" containerName="registry-server" Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.005573 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aad74fd-c975-42c4-8b05-ed28dbd55205" containerName="registry-server" Oct 03 14:47:58 crc kubenswrapper[4774]: E1003 14:47:58.005584 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8922cd85-deea-4326-88fc-68c67debf56c" containerName="extract-content" Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.005592 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="8922cd85-deea-4326-88fc-68c67debf56c" containerName="extract-content" Oct 03 14:47:58 crc kubenswrapper[4774]: E1003 14:47:58.005604 4774 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="8922cd85-deea-4326-88fc-68c67debf56c" containerName="extract-utilities" Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.005611 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="8922cd85-deea-4326-88fc-68c67debf56c" containerName="extract-utilities" Oct 03 14:47:58 crc kubenswrapper[4774]: E1003 14:47:58.005623 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="923b3b0f-6810-4504-9757-fe4761f6ed37" containerName="marketplace-operator" Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.005632 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="923b3b0f-6810-4504-9757-fe4761f6ed37" containerName="marketplace-operator" Oct 03 14:47:58 crc kubenswrapper[4774]: E1003 14:47:58.005648 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8922cd85-deea-4326-88fc-68c67debf56c" containerName="registry-server" Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.005655 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="8922cd85-deea-4326-88fc-68c67debf56c" containerName="registry-server" Oct 03 14:47:58 crc kubenswrapper[4774]: E1003 14:47:58.005666 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a6d0e1-38db-42e1-8e29-b52de5bbad1d" containerName="extract-content" Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.005674 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a6d0e1-38db-42e1-8e29-b52de5bbad1d" containerName="extract-content" Oct 03 14:47:58 crc kubenswrapper[4774]: E1003 14:47:58.005683 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a6d0e1-38db-42e1-8e29-b52de5bbad1d" containerName="registry-server" Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.005690 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a6d0e1-38db-42e1-8e29-b52de5bbad1d" containerName="registry-server" Oct 03 14:47:58 crc kubenswrapper[4774]: E1003 14:47:58.005703 4774 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="062191e1-9f34-4dba-bd1c-9bffe53f5cfd" containerName="extract-utilities" Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.005713 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="062191e1-9f34-4dba-bd1c-9bffe53f5cfd" containerName="extract-utilities" Oct 03 14:47:58 crc kubenswrapper[4774]: E1003 14:47:58.005722 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a6d0e1-38db-42e1-8e29-b52de5bbad1d" containerName="extract-utilities" Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.005729 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a6d0e1-38db-42e1-8e29-b52de5bbad1d" containerName="extract-utilities" Oct 03 14:47:58 crc kubenswrapper[4774]: E1003 14:47:58.006440 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="062191e1-9f34-4dba-bd1c-9bffe53f5cfd" containerName="registry-server" Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.006456 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="062191e1-9f34-4dba-bd1c-9bffe53f5cfd" containerName="registry-server" Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.006569 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="923b3b0f-6810-4504-9757-fe4761f6ed37" containerName="marketplace-operator" Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.006587 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="062191e1-9f34-4dba-bd1c-9bffe53f5cfd" containerName="registry-server" Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.006602 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="8922cd85-deea-4326-88fc-68c67debf56c" containerName="registry-server" Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.006613 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4a6d0e1-38db-42e1-8e29-b52de5bbad1d" containerName="registry-server" Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.006624 4774 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2aad74fd-c975-42c4-8b05-ed28dbd55205" containerName="registry-server"
Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.007473 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m7x75"
Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.009966 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.015401 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m7x75"]
Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.029261 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f0d7089-cd30-47db-a3cc-44492151e300-utilities\") pod \"community-operators-m7x75\" (UID: \"8f0d7089-cd30-47db-a3cc-44492151e300\") " pod="openshift-marketplace/community-operators-m7x75"
Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.029490 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f0d7089-cd30-47db-a3cc-44492151e300-catalog-content\") pod \"community-operators-m7x75\" (UID: \"8f0d7089-cd30-47db-a3cc-44492151e300\") " pod="openshift-marketplace/community-operators-m7x75"
Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.029594 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7jrz\" (UniqueName: \"kubernetes.io/projected/8f0d7089-cd30-47db-a3cc-44492151e300-kube-api-access-w7jrz\") pod \"community-operators-m7x75\" (UID: \"8f0d7089-cd30-47db-a3cc-44492151e300\") " pod="openshift-marketplace/community-operators-m7x75"
Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.130455 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f0d7089-cd30-47db-a3cc-44492151e300-catalog-content\") pod \"community-operators-m7x75\" (UID: \"8f0d7089-cd30-47db-a3cc-44492151e300\") " pod="openshift-marketplace/community-operators-m7x75"
Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.130729 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7jrz\" (UniqueName: \"kubernetes.io/projected/8f0d7089-cd30-47db-a3cc-44492151e300-kube-api-access-w7jrz\") pod \"community-operators-m7x75\" (UID: \"8f0d7089-cd30-47db-a3cc-44492151e300\") " pod="openshift-marketplace/community-operators-m7x75"
Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.130886 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f0d7089-cd30-47db-a3cc-44492151e300-utilities\") pod \"community-operators-m7x75\" (UID: \"8f0d7089-cd30-47db-a3cc-44492151e300\") " pod="openshift-marketplace/community-operators-m7x75"
Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.131534 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f0d7089-cd30-47db-a3cc-44492151e300-utilities\") pod \"community-operators-m7x75\" (UID: \"8f0d7089-cd30-47db-a3cc-44492151e300\") " pod="openshift-marketplace/community-operators-m7x75"
Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.131882 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f0d7089-cd30-47db-a3cc-44492151e300-catalog-content\") pod \"community-operators-m7x75\" (UID: \"8f0d7089-cd30-47db-a3cc-44492151e300\") " pod="openshift-marketplace/community-operators-m7x75"
Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.158149 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7jrz\" (UniqueName: \"kubernetes.io/projected/8f0d7089-cd30-47db-a3cc-44492151e300-kube-api-access-w7jrz\") pod \"community-operators-m7x75\" (UID: \"8f0d7089-cd30-47db-a3cc-44492151e300\") " pod="openshift-marketplace/community-operators-m7x75"
Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.198144 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4knpn"]
Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.199319 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4knpn"
Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.201664 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.211805 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4knpn"]
Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.231998 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt7tb\" (UniqueName: \"kubernetes.io/projected/a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c-kube-api-access-dt7tb\") pod \"certified-operators-4knpn\" (UID: \"a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c\") " pod="openshift-marketplace/certified-operators-4knpn"
Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.232241 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c-utilities\") pod \"certified-operators-4knpn\" (UID: \"a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c\") " pod="openshift-marketplace/certified-operators-4knpn"
Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.232332 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c-catalog-content\") pod \"certified-operators-4knpn\" (UID: \"a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c\") " pod="openshift-marketplace/certified-operators-4knpn"
Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.327166 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m7x75"
Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.333802 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c-utilities\") pod \"certified-operators-4knpn\" (UID: \"a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c\") " pod="openshift-marketplace/certified-operators-4knpn"
Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.333871 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c-catalog-content\") pod \"certified-operators-4knpn\" (UID: \"a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c\") " pod="openshift-marketplace/certified-operators-4knpn"
Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.333966 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dt7tb\" (UniqueName: \"kubernetes.io/projected/a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c-kube-api-access-dt7tb\") pod \"certified-operators-4knpn\" (UID: \"a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c\") " pod="openshift-marketplace/certified-operators-4knpn"
Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.334602 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c-utilities\") pod \"certified-operators-4knpn\" (UID: \"a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c\") " pod="openshift-marketplace/certified-operators-4knpn"
Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.335075 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c-catalog-content\") pod \"certified-operators-4knpn\" (UID: \"a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c\") " pod="openshift-marketplace/certified-operators-4knpn"
Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.350631 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt7tb\" (UniqueName: \"kubernetes.io/projected/a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c-kube-api-access-dt7tb\") pod \"certified-operators-4knpn\" (UID: \"a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c\") " pod="openshift-marketplace/certified-operators-4knpn"
Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.518999 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4knpn"
Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.712800 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m7x75"]
Oct 03 14:47:58 crc kubenswrapper[4774]: W1003 14:47:58.727033 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f0d7089_cd30_47db_a3cc_44492151e300.slice/crio-2818ca61915c86816c0a848223ab7d0197df309f45f055fe43c5c686481651bb WatchSource:0}: Error finding container 2818ca61915c86816c0a848223ab7d0197df309f45f055fe43c5c686481651bb: Status 404 returned error can't find the container with id 2818ca61915c86816c0a848223ab7d0197df309f45f055fe43c5c686481651bb
Oct 03 14:47:58 crc kubenswrapper[4774]: I1003 14:47:58.735274 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4knpn"]
Oct 03 14:47:58 crc kubenswrapper[4774]: W1003 14:47:58.741696 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6a812a9_1e31_4cdc_ab9c_7f3f7af2977c.slice/crio-883798d27830b705ac721f5dcb27c6af41472f6667215c8f77eb7eff4c942346 WatchSource:0}: Error finding container 883798d27830b705ac721f5dcb27c6af41472f6667215c8f77eb7eff4c942346: Status 404 returned error can't find the container with id 883798d27830b705ac721f5dcb27c6af41472f6667215c8f77eb7eff4c942346
Oct 03 14:47:59 crc kubenswrapper[4774]: I1003 14:47:59.524777 4774 generic.go:334] "Generic (PLEG): container finished" podID="a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c" containerID="c58ab45d72f13046276ccb283f7d70da71aea9a95566aa2b1bd92fbdab8c4cc0" exitCode=0
Oct 03 14:47:59 crc kubenswrapper[4774]: I1003 14:47:59.524862 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4knpn" event={"ID":"a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c","Type":"ContainerDied","Data":"c58ab45d72f13046276ccb283f7d70da71aea9a95566aa2b1bd92fbdab8c4cc0"}
Oct 03 14:47:59 crc kubenswrapper[4774]: I1003 14:47:59.526272 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4knpn" event={"ID":"a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c","Type":"ContainerStarted","Data":"883798d27830b705ac721f5dcb27c6af41472f6667215c8f77eb7eff4c942346"}
Oct 03 14:47:59 crc kubenswrapper[4774]: I1003 14:47:59.528511 4774 generic.go:334] "Generic (PLEG): container finished" podID="8f0d7089-cd30-47db-a3cc-44492151e300" containerID="e92d3b39fae3303420398a4feadc02734b6003270df34be29de71ce1567d2f5d" exitCode=0
Oct 03 14:47:59 crc kubenswrapper[4774]: I1003 14:47:59.528602 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7x75" event={"ID":"8f0d7089-cd30-47db-a3cc-44492151e300","Type":"ContainerDied","Data":"e92d3b39fae3303420398a4feadc02734b6003270df34be29de71ce1567d2f5d"}
Oct 03 14:47:59 crc kubenswrapper[4774]: I1003 14:47:59.528640 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7x75" event={"ID":"8f0d7089-cd30-47db-a3cc-44492151e300","Type":"ContainerStarted","Data":"2818ca61915c86816c0a848223ab7d0197df309f45f055fe43c5c686481651bb"}
Oct 03 14:48:00 crc kubenswrapper[4774]: I1003 14:48:00.395587 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2gxlv"]
Oct 03 14:48:00 crc kubenswrapper[4774]: I1003 14:48:00.397623 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2gxlv"
Oct 03 14:48:00 crc kubenswrapper[4774]: I1003 14:48:00.401659 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Oct 03 14:48:00 crc kubenswrapper[4774]: I1003 14:48:00.415792 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2gxlv"]
Oct 03 14:48:00 crc kubenswrapper[4774]: I1003 14:48:00.535461 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7x75" event={"ID":"8f0d7089-cd30-47db-a3cc-44492151e300","Type":"ContainerStarted","Data":"79381b86a54a118193dcbd1b33eaf2644712126fa0ecbbbd418f974b8fdf7ded"}
Oct 03 14:48:00 crc kubenswrapper[4774]: I1003 14:48:00.556491 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6-catalog-content\") pod \"redhat-marketplace-2gxlv\" (UID: \"cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6\") " pod="openshift-marketplace/redhat-marketplace-2gxlv"
Oct 03 14:48:00 crc kubenswrapper[4774]: I1003 14:48:00.556674 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhtd8\" (UniqueName: \"kubernetes.io/projected/cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6-kube-api-access-dhtd8\") pod \"redhat-marketplace-2gxlv\" (UID: \"cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6\") " pod="openshift-marketplace/redhat-marketplace-2gxlv"
Oct 03 14:48:00 crc kubenswrapper[4774]: I1003 14:48:00.556734 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6-utilities\") pod \"redhat-marketplace-2gxlv\" (UID: \"cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6\") " pod="openshift-marketplace/redhat-marketplace-2gxlv"
Oct 03 14:48:00 crc kubenswrapper[4774]: I1003 14:48:00.596810 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mx9vw"]
Oct 03 14:48:00 crc kubenswrapper[4774]: I1003 14:48:00.598135 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mx9vw"
Oct 03 14:48:00 crc kubenswrapper[4774]: I1003 14:48:00.601074 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Oct 03 14:48:00 crc kubenswrapper[4774]: I1003 14:48:00.609267 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mx9vw"]
Oct 03 14:48:00 crc kubenswrapper[4774]: I1003 14:48:00.657926 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0c1612a-d998-4683-abf4-433f470c76b1-utilities\") pod \"redhat-operators-mx9vw\" (UID: \"f0c1612a-d998-4683-abf4-433f470c76b1\") " pod="openshift-marketplace/redhat-operators-mx9vw"
Oct 03 14:48:00 crc kubenswrapper[4774]: I1003 14:48:00.657982 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6-catalog-content\") pod \"redhat-marketplace-2gxlv\" (UID: \"cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6\") " pod="openshift-marketplace/redhat-marketplace-2gxlv"
Oct 03 14:48:00 crc kubenswrapper[4774]: I1003 14:48:00.658035 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhtd8\" (UniqueName: \"kubernetes.io/projected/cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6-kube-api-access-dhtd8\") pod \"redhat-marketplace-2gxlv\" (UID: \"cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6\") " pod="openshift-marketplace/redhat-marketplace-2gxlv"
Oct 03 14:48:00 crc kubenswrapper[4774]: I1003 14:48:00.658075 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4c5x\" (UniqueName: \"kubernetes.io/projected/f0c1612a-d998-4683-abf4-433f470c76b1-kube-api-access-n4c5x\") pod \"redhat-operators-mx9vw\" (UID: \"f0c1612a-d998-4683-abf4-433f470c76b1\") " pod="openshift-marketplace/redhat-operators-mx9vw"
Oct 03 14:48:00 crc kubenswrapper[4774]: I1003 14:48:00.658105 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6-utilities\") pod \"redhat-marketplace-2gxlv\" (UID: \"cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6\") " pod="openshift-marketplace/redhat-marketplace-2gxlv"
Oct 03 14:48:00 crc kubenswrapper[4774]: I1003 14:48:00.658122 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0c1612a-d998-4683-abf4-433f470c76b1-catalog-content\") pod \"redhat-operators-mx9vw\" (UID: \"f0c1612a-d998-4683-abf4-433f470c76b1\") " pod="openshift-marketplace/redhat-operators-mx9vw"
Oct 03 14:48:00 crc kubenswrapper[4774]: I1003 14:48:00.658534 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6-catalog-content\") pod \"redhat-marketplace-2gxlv\" (UID: \"cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6\") " pod="openshift-marketplace/redhat-marketplace-2gxlv"
Oct 03 14:48:00 crc kubenswrapper[4774]: I1003 14:48:00.658798 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6-utilities\") pod \"redhat-marketplace-2gxlv\" (UID: \"cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6\") " pod="openshift-marketplace/redhat-marketplace-2gxlv"
Oct 03 14:48:00 crc kubenswrapper[4774]: I1003 14:48:00.675969 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhtd8\" (UniqueName: \"kubernetes.io/projected/cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6-kube-api-access-dhtd8\") pod \"redhat-marketplace-2gxlv\" (UID: \"cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6\") " pod="openshift-marketplace/redhat-marketplace-2gxlv"
Oct 03 14:48:00 crc kubenswrapper[4774]: I1003 14:48:00.718634 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2gxlv"
Oct 03 14:48:00 crc kubenswrapper[4774]: I1003 14:48:00.759305 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0c1612a-d998-4683-abf4-433f470c76b1-catalog-content\") pod \"redhat-operators-mx9vw\" (UID: \"f0c1612a-d998-4683-abf4-433f470c76b1\") " pod="openshift-marketplace/redhat-operators-mx9vw"
Oct 03 14:48:00 crc kubenswrapper[4774]: I1003 14:48:00.759363 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0c1612a-d998-4683-abf4-433f470c76b1-utilities\") pod \"redhat-operators-mx9vw\" (UID: \"f0c1612a-d998-4683-abf4-433f470c76b1\") " pod="openshift-marketplace/redhat-operators-mx9vw"
Oct 03 14:48:00 crc kubenswrapper[4774]: I1003 14:48:00.759454 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4c5x\" (UniqueName: \"kubernetes.io/projected/f0c1612a-d998-4683-abf4-433f470c76b1-kube-api-access-n4c5x\") pod \"redhat-operators-mx9vw\" (UID: \"f0c1612a-d998-4683-abf4-433f470c76b1\") " pod="openshift-marketplace/redhat-operators-mx9vw"
Oct 03 14:48:00 crc kubenswrapper[4774]: I1003 14:48:00.759880 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0c1612a-d998-4683-abf4-433f470c76b1-utilities\") pod \"redhat-operators-mx9vw\" (UID: \"f0c1612a-d998-4683-abf4-433f470c76b1\") " pod="openshift-marketplace/redhat-operators-mx9vw"
Oct 03 14:48:00 crc kubenswrapper[4774]: I1003 14:48:00.759881 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0c1612a-d998-4683-abf4-433f470c76b1-catalog-content\") pod \"redhat-operators-mx9vw\" (UID: \"f0c1612a-d998-4683-abf4-433f470c76b1\") " pod="openshift-marketplace/redhat-operators-mx9vw"
Oct 03 14:48:00 crc kubenswrapper[4774]: I1003 14:48:00.780449 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4c5x\" (UniqueName: \"kubernetes.io/projected/f0c1612a-d998-4683-abf4-433f470c76b1-kube-api-access-n4c5x\") pod \"redhat-operators-mx9vw\" (UID: \"f0c1612a-d998-4683-abf4-433f470c76b1\") " pod="openshift-marketplace/redhat-operators-mx9vw"
Oct 03 14:48:00 crc kubenswrapper[4774]: I1003 14:48:00.917279 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mx9vw"
Oct 03 14:48:00 crc kubenswrapper[4774]: I1003 14:48:00.937864 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2gxlv"]
Oct 03 14:48:01 crc kubenswrapper[4774]: I1003 14:48:01.117588 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mx9vw"]
Oct 03 14:48:01 crc kubenswrapper[4774]: W1003 14:48:01.149121 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0c1612a_d998_4683_abf4_433f470c76b1.slice/crio-afc5f319dbf4a9cdfa366c1bc0505d8c82978d82b8f017105ffd234dd27730d2 WatchSource:0}: Error finding container afc5f319dbf4a9cdfa366c1bc0505d8c82978d82b8f017105ffd234dd27730d2: Status 404 returned error can't find the container with id afc5f319dbf4a9cdfa366c1bc0505d8c82978d82b8f017105ffd234dd27730d2
Oct 03 14:48:01 crc kubenswrapper[4774]: I1003 14:48:01.541857 4774 generic.go:334] "Generic (PLEG): container finished" podID="cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6" containerID="6c1f9514a17f9234a9df38a967ae34c1900d5f23a4830304d679b9c5b145826f" exitCode=0
Oct 03 14:48:01 crc kubenswrapper[4774]: I1003 14:48:01.541927 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2gxlv" event={"ID":"cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6","Type":"ContainerDied","Data":"6c1f9514a17f9234a9df38a967ae34c1900d5f23a4830304d679b9c5b145826f"}
Oct 03 14:48:01 crc kubenswrapper[4774]: I1003 14:48:01.542204 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2gxlv" event={"ID":"cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6","Type":"ContainerStarted","Data":"caa92290625d627074437f73fb1e34be578ec24cafc5d0d95826995db5cc737e"}
Oct 03 14:48:01 crc kubenswrapper[4774]: I1003 14:48:01.544191 4774 generic.go:334] "Generic (PLEG): container finished" podID="a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c" containerID="9acb6fe694b5ad938cb9f13b3d01bb7659b70a901cea539638b628f267570343" exitCode=0
Oct 03 14:48:01 crc kubenswrapper[4774]: I1003 14:48:01.544253 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4knpn" event={"ID":"a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c","Type":"ContainerDied","Data":"9acb6fe694b5ad938cb9f13b3d01bb7659b70a901cea539638b628f267570343"}
Oct 03 14:48:01 crc kubenswrapper[4774]: I1003 14:48:01.548812 4774 generic.go:334] "Generic (PLEG): container finished" podID="8f0d7089-cd30-47db-a3cc-44492151e300" containerID="79381b86a54a118193dcbd1b33eaf2644712126fa0ecbbbd418f974b8fdf7ded" exitCode=0
Oct 03 14:48:01 crc kubenswrapper[4774]: I1003 14:48:01.548883 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7x75" event={"ID":"8f0d7089-cd30-47db-a3cc-44492151e300","Type":"ContainerDied","Data":"79381b86a54a118193dcbd1b33eaf2644712126fa0ecbbbd418f974b8fdf7ded"}
Oct 03 14:48:01 crc kubenswrapper[4774]: I1003 14:48:01.550922 4774 generic.go:334] "Generic (PLEG): container finished" podID="f0c1612a-d998-4683-abf4-433f470c76b1" containerID="245082377cf839aa61020bec406cb8122948cd542ea71a41d8bd41134436b08f" exitCode=0
Oct 03 14:48:01 crc kubenswrapper[4774]: I1003 14:48:01.550967 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mx9vw" event={"ID":"f0c1612a-d998-4683-abf4-433f470c76b1","Type":"ContainerDied","Data":"245082377cf839aa61020bec406cb8122948cd542ea71a41d8bd41134436b08f"}
Oct 03 14:48:01 crc kubenswrapper[4774]: I1003 14:48:01.550996 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mx9vw" event={"ID":"f0c1612a-d998-4683-abf4-433f470c76b1","Type":"ContainerStarted","Data":"afc5f319dbf4a9cdfa366c1bc0505d8c82978d82b8f017105ffd234dd27730d2"}
Oct 03 14:48:02 crc kubenswrapper[4774]: I1003 14:48:02.558078 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mx9vw" event={"ID":"f0c1612a-d998-4683-abf4-433f470c76b1","Type":"ContainerStarted","Data":"9663cfa6e22ab53ce4f9c84413d01f19dc5829409e6fa71c53e5b8fe9f5f8639"}
Oct 03 14:48:02 crc kubenswrapper[4774]: I1003 14:48:02.559806 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7x75" event={"ID":"8f0d7089-cd30-47db-a3cc-44492151e300","Type":"ContainerStarted","Data":"ecd60bcf1abe61982f8fd15ce3c99c502fe0b375fc8d40e6af8557889b7ea909"}
Oct 03 14:48:02 crc kubenswrapper[4774]: I1003 14:48:02.561614 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4knpn" event={"ID":"a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c","Type":"ContainerStarted","Data":"4a5615b2aeef9825e7d3fdcd53099365b08f8bb66d71d106bd55cc2dc685f22b"}
Oct 03 14:48:02 crc kubenswrapper[4774]: I1003 14:48:02.596485 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4knpn" podStartSLOduration=1.910511815 podStartE2EDuration="4.596463092s" podCreationTimestamp="2025-10-03 14:47:58 +0000 UTC" firstStartedPulling="2025-10-03 14:47:59.528265841 +0000 UTC m=+302.117469303" lastFinishedPulling="2025-10-03 14:48:02.214217118 +0000 UTC m=+304.803420580" observedRunningTime="2025-10-03 14:48:02.59413867 +0000 UTC m=+305.183342132" watchObservedRunningTime="2025-10-03 14:48:02.596463092 +0000 UTC m=+305.185666544"
Oct 03 14:48:02 crc kubenswrapper[4774]: I1003 14:48:02.610448 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m7x75" podStartSLOduration=3.179966416 podStartE2EDuration="5.610429552s" podCreationTimestamp="2025-10-03 14:47:57 +0000 UTC" firstStartedPulling="2025-10-03 14:47:59.529960439 +0000 UTC m=+302.119163891" lastFinishedPulling="2025-10-03 14:48:01.960423575 +0000 UTC m=+304.549627027" observedRunningTime="2025-10-03 14:48:02.60716748 +0000 UTC m=+305.196370932" watchObservedRunningTime="2025-10-03 14:48:02.610429552 +0000 UTC m=+305.199633004"
Oct 03 14:48:03 crc kubenswrapper[4774]: I1003 14:48:03.567562 4774 generic.go:334] "Generic (PLEG): container finished" podID="f0c1612a-d998-4683-abf4-433f470c76b1" containerID="9663cfa6e22ab53ce4f9c84413d01f19dc5829409e6fa71c53e5b8fe9f5f8639" exitCode=0
Oct 03 14:48:03 crc kubenswrapper[4774]: I1003 14:48:03.567906 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mx9vw" event={"ID":"f0c1612a-d998-4683-abf4-433f470c76b1","Type":"ContainerDied","Data":"9663cfa6e22ab53ce4f9c84413d01f19dc5829409e6fa71c53e5b8fe9f5f8639"}
Oct 03 14:48:03 crc kubenswrapper[4774]: I1003 14:48:03.573333 4774 generic.go:334] "Generic (PLEG): container finished" podID="cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6" containerID="f0f2d8594c526b91ac9a0b042ceb623ace42ef64fd025bd3737e4b620052e55d" exitCode=0
Oct 03 14:48:03 crc kubenswrapper[4774]: I1003 14:48:03.573411 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2gxlv" event={"ID":"cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6","Type":"ContainerDied","Data":"f0f2d8594c526b91ac9a0b042ceb623ace42ef64fd025bd3737e4b620052e55d"}
Oct 03 14:48:04 crc kubenswrapper[4774]: I1003 14:48:04.579765 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mx9vw" event={"ID":"f0c1612a-d998-4683-abf4-433f470c76b1","Type":"ContainerStarted","Data":"252b4f0932a80d33983f600dc112178d5bca12bbdb283611b81045783781ec36"}
Oct 03 14:48:04 crc kubenswrapper[4774]: I1003 14:48:04.583170 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2gxlv" event={"ID":"cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6","Type":"ContainerStarted","Data":"749ed1005f379ce42386304c92cf87fd124e3a612d3a962eb17363993d229050"}
Oct 03 14:48:04 crc kubenswrapper[4774]: I1003 14:48:04.595235 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mx9vw" podStartSLOduration=2.1367153979999998 podStartE2EDuration="4.595212985s" podCreationTimestamp="2025-10-03 14:48:00 +0000 UTC" firstStartedPulling="2025-10-03 14:48:01.551949088 +0000 UTC m=+304.141152530" lastFinishedPulling="2025-10-03 14:48:04.010446665 +0000 UTC m=+306.599650117" observedRunningTime="2025-10-03 14:48:04.595096002 +0000 UTC m=+307.184299464" watchObservedRunningTime="2025-10-03 14:48:04.595212985 +0000 UTC m=+307.184416437"
Oct 03 14:48:04 crc kubenswrapper[4774]: I1003 14:48:04.611123 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2gxlv" podStartSLOduration=1.685145065 podStartE2EDuration="4.611068577s" podCreationTimestamp="2025-10-03 14:48:00 +0000 UTC" firstStartedPulling="2025-10-03 14:48:01.543172624 +0000 UTC m=+304.132376076" lastFinishedPulling="2025-10-03 14:48:04.469096146 +0000 UTC m=+307.058299588" observedRunningTime="2025-10-03 14:48:04.608142342 +0000 UTC m=+307.197345794" watchObservedRunningTime="2025-10-03 14:48:04.611068577 +0000 UTC m=+307.200272019"
Oct 03 14:48:08 crc kubenswrapper[4774]: I1003 14:48:08.328243 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m7x75"
Oct 03 14:48:08 crc kubenswrapper[4774]: I1003 14:48:08.328611 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m7x75"
Oct 03 14:48:08 crc kubenswrapper[4774]: I1003 14:48:08.370000 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m7x75"
Oct 03 14:48:08 crc kubenswrapper[4774]: I1003 14:48:08.520049 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4knpn"
Oct 03 14:48:08 crc kubenswrapper[4774]: I1003 14:48:08.520362 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4knpn"
Oct 03 14:48:08 crc kubenswrapper[4774]: I1003 14:48:08.562115 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4knpn"
Oct 03 14:48:08 crc kubenswrapper[4774]: I1003 14:48:08.643843 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m7x75"
Oct 03 14:48:08 crc kubenswrapper[4774]: I1003 14:48:08.645485 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4knpn"
Oct 03 14:48:10 crc kubenswrapper[4774]: I1003 14:48:10.718976 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2gxlv"
Oct 03 14:48:10 crc kubenswrapper[4774]: I1003 14:48:10.719487 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2gxlv"
Oct 03 14:48:10 crc kubenswrapper[4774]: I1003 14:48:10.761830 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2gxlv"
Oct 03 14:48:10 crc kubenswrapper[4774]: I1003 14:48:10.918278 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mx9vw"
Oct 03 14:48:10 crc kubenswrapper[4774]: I1003 14:48:10.918335 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mx9vw"
Oct 03 14:48:10 crc kubenswrapper[4774]: I1003 14:48:10.953808 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mx9vw"
Oct 03 14:48:11 crc kubenswrapper[4774]: I1003 14:48:11.663718 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mx9vw"
Oct 03 14:48:11 crc kubenswrapper[4774]: I1003 14:48:11.687885 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2gxlv"
Oct 03 14:49:20 crc kubenswrapper[4774]: I1003 14:49:20.654328 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 14:49:20 crc kubenswrapper[4774]: I1003 14:49:20.654962 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 14:49:50 crc kubenswrapper[4774]: I1003 14:49:50.653819 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 14:49:50 crc kubenswrapper[4774]: I1003 14:49:50.654254 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 14:50:00 crc kubenswrapper[4774]: I1003 14:50:00.192442 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-j2s56"]
Oct 03 14:50:00 crc kubenswrapper[4774]: I1003 14:50:00.193769 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-j2s56"
Oct 03 14:50:00 crc kubenswrapper[4774]: I1003 14:50:00.223811 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-j2s56"]
Oct 03 14:50:00 crc kubenswrapper[4774]: I1003 14:50:00.317980 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/230a7f18-c188-413d-bbb9-4f1da44357f7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-j2s56\" (UID: \"230a7f18-c188-413d-bbb9-4f1da44357f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2s56"
Oct 03 14:50:00 crc kubenswrapper[4774]: I1003 14:50:00.318049 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/230a7f18-c188-413d-bbb9-4f1da44357f7-registry-tls\") pod \"image-registry-66df7c8f76-j2s56\" (UID: \"230a7f18-c188-413d-bbb9-4f1da44357f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2s56"
Oct 03 14:50:00 crc kubenswrapper[4774]: I1003 14:50:00.318179 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/230a7f18-c188-413d-bbb9-4f1da44357f7-bound-sa-token\") pod \"image-registry-66df7c8f76-j2s56\" (UID: \"230a7f18-c188-413d-bbb9-4f1da44357f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2s56"
Oct 03 14:50:00 crc kubenswrapper[4774]: I1003 14:50:00.318321 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/230a7f18-c188-413d-bbb9-4f1da44357f7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-j2s56\" (UID: \"230a7f18-c188-413d-bbb9-4f1da44357f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2s56"
Oct 03 14:50:00 crc kubenswrapper[4774]: I1003 14:50:00.318388 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75759\" (UniqueName: \"kubernetes.io/projected/230a7f18-c188-413d-bbb9-4f1da44357f7-kube-api-access-75759\") pod \"image-registry-66df7c8f76-j2s56\" (UID: \"230a7f18-c188-413d-bbb9-4f1da44357f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2s56"
Oct 03 14:50:00 crc kubenswrapper[4774]: I1003 14:50:00.318414 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/230a7f18-c188-413d-bbb9-4f1da44357f7-trusted-ca\") pod \"image-registry-66df7c8f76-j2s56\" (UID: \"230a7f18-c188-413d-bbb9-4f1da44357f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2s56"
Oct 03 14:50:00 crc kubenswrapper[4774]: I1003 14:50:00.318497 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-j2s56\" (UID: \"230a7f18-c188-413d-bbb9-4f1da44357f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2s56"
Oct 03 14:50:00 crc kubenswrapper[4774]: I1003 14:50:00.318538 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/230a7f18-c188-413d-bbb9-4f1da44357f7-registry-certificates\") pod \"image-registry-66df7c8f76-j2s56\" (UID: \"230a7f18-c188-413d-bbb9-4f1da44357f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2s56"
Oct 03 14:50:00 crc kubenswrapper[4774]: I1003 14:50:00.337820 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-j2s56\" (UID: \"230a7f18-c188-413d-bbb9-4f1da44357f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2s56"
Oct 03 14:50:00 crc kubenswrapper[4774]: I1003 14:50:00.419866 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75759\" (UniqueName: \"kubernetes.io/projected/230a7f18-c188-413d-bbb9-4f1da44357f7-kube-api-access-75759\") pod \"image-registry-66df7c8f76-j2s56\" (UID: \"230a7f18-c188-413d-bbb9-4f1da44357f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2s56"
Oct 03 14:50:00 crc kubenswrapper[4774]: I1003 14:50:00.419911 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/230a7f18-c188-413d-bbb9-4f1da44357f7-trusted-ca\") pod \"image-registry-66df7c8f76-j2s56\" (UID: \"230a7f18-c188-413d-bbb9-4f1da44357f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2s56"
Oct 03 14:50:00 crc kubenswrapper[4774]: I1003 14:50:00.419938 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/230a7f18-c188-413d-bbb9-4f1da44357f7-registry-certificates\") pod \"image-registry-66df7c8f76-j2s56\" (UID: \"230a7f18-c188-413d-bbb9-4f1da44357f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2s56"
Oct 03 14:50:00 crc kubenswrapper[4774]: I1003 14:50:00.419972 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/230a7f18-c188-413d-bbb9-4f1da44357f7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-j2s56\" (UID: \"230a7f18-c188-413d-bbb9-4f1da44357f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2s56"
Oct 03 
14:50:00 crc kubenswrapper[4774]: I1003 14:50:00.420031 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/230a7f18-c188-413d-bbb9-4f1da44357f7-registry-tls\") pod \"image-registry-66df7c8f76-j2s56\" (UID: \"230a7f18-c188-413d-bbb9-4f1da44357f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2s56" Oct 03 14:50:00 crc kubenswrapper[4774]: I1003 14:50:00.420064 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/230a7f18-c188-413d-bbb9-4f1da44357f7-bound-sa-token\") pod \"image-registry-66df7c8f76-j2s56\" (UID: \"230a7f18-c188-413d-bbb9-4f1da44357f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2s56" Oct 03 14:50:00 crc kubenswrapper[4774]: I1003 14:50:00.420109 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/230a7f18-c188-413d-bbb9-4f1da44357f7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-j2s56\" (UID: \"230a7f18-c188-413d-bbb9-4f1da44357f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2s56" Oct 03 14:50:00 crc kubenswrapper[4774]: I1003 14:50:00.421196 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/230a7f18-c188-413d-bbb9-4f1da44357f7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-j2s56\" (UID: \"230a7f18-c188-413d-bbb9-4f1da44357f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2s56" Oct 03 14:50:00 crc kubenswrapper[4774]: I1003 14:50:00.421997 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/230a7f18-c188-413d-bbb9-4f1da44357f7-registry-certificates\") pod \"image-registry-66df7c8f76-j2s56\" (UID: \"230a7f18-c188-413d-bbb9-4f1da44357f7\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-j2s56" Oct 03 14:50:00 crc kubenswrapper[4774]: I1003 14:50:00.422020 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/230a7f18-c188-413d-bbb9-4f1da44357f7-trusted-ca\") pod \"image-registry-66df7c8f76-j2s56\" (UID: \"230a7f18-c188-413d-bbb9-4f1da44357f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2s56" Oct 03 14:50:00 crc kubenswrapper[4774]: I1003 14:50:00.426338 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/230a7f18-c188-413d-bbb9-4f1da44357f7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-j2s56\" (UID: \"230a7f18-c188-413d-bbb9-4f1da44357f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2s56" Oct 03 14:50:00 crc kubenswrapper[4774]: I1003 14:50:00.426424 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/230a7f18-c188-413d-bbb9-4f1da44357f7-registry-tls\") pod \"image-registry-66df7c8f76-j2s56\" (UID: \"230a7f18-c188-413d-bbb9-4f1da44357f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2s56" Oct 03 14:50:00 crc kubenswrapper[4774]: I1003 14:50:00.437009 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75759\" (UniqueName: \"kubernetes.io/projected/230a7f18-c188-413d-bbb9-4f1da44357f7-kube-api-access-75759\") pod \"image-registry-66df7c8f76-j2s56\" (UID: \"230a7f18-c188-413d-bbb9-4f1da44357f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2s56" Oct 03 14:50:00 crc kubenswrapper[4774]: I1003 14:50:00.441503 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/230a7f18-c188-413d-bbb9-4f1da44357f7-bound-sa-token\") pod \"image-registry-66df7c8f76-j2s56\" (UID: 
\"230a7f18-c188-413d-bbb9-4f1da44357f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-j2s56" Oct 03 14:50:00 crc kubenswrapper[4774]: I1003 14:50:00.511048 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-j2s56" Oct 03 14:50:00 crc kubenswrapper[4774]: I1003 14:50:00.693696 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-j2s56"] Oct 03 14:50:00 crc kubenswrapper[4774]: W1003 14:50:00.702330 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod230a7f18_c188_413d_bbb9_4f1da44357f7.slice/crio-95957509cd953be7fddf9f608a432aaf08bc485c787495af7504c1526b50c251 WatchSource:0}: Error finding container 95957509cd953be7fddf9f608a432aaf08bc485c787495af7504c1526b50c251: Status 404 returned error can't find the container with id 95957509cd953be7fddf9f608a432aaf08bc485c787495af7504c1526b50c251 Oct 03 14:50:01 crc kubenswrapper[4774]: I1003 14:50:01.247665 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-j2s56" event={"ID":"230a7f18-c188-413d-bbb9-4f1da44357f7","Type":"ContainerStarted","Data":"1044d4ef42ed21df50c2dc96afca4ca23ea2b66f10db82d943487b3611dbb1fd"} Oct 03 14:50:01 crc kubenswrapper[4774]: I1003 14:50:01.247714 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-j2s56" event={"ID":"230a7f18-c188-413d-bbb9-4f1da44357f7","Type":"ContainerStarted","Data":"95957509cd953be7fddf9f608a432aaf08bc485c787495af7504c1526b50c251"} Oct 03 14:50:01 crc kubenswrapper[4774]: I1003 14:50:01.247830 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-j2s56" Oct 03 14:50:20 crc kubenswrapper[4774]: I1003 14:50:20.517454 4774 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-j2s56" Oct 03 14:50:20 crc kubenswrapper[4774]: I1003 14:50:20.544716 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-j2s56" podStartSLOduration=20.544687588 podStartE2EDuration="20.544687588s" podCreationTimestamp="2025-10-03 14:50:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:50:01.27583765 +0000 UTC m=+423.865041142" watchObservedRunningTime="2025-10-03 14:50:20.544687588 +0000 UTC m=+443.133891080" Oct 03 14:50:20 crc kubenswrapper[4774]: I1003 14:50:20.574560 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4sjxw"] Oct 03 14:50:20 crc kubenswrapper[4774]: I1003 14:50:20.653615 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:50:20 crc kubenswrapper[4774]: I1003 14:50:20.653684 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:50:20 crc kubenswrapper[4774]: I1003 14:50:20.653760 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" Oct 03 14:50:20 crc kubenswrapper[4774]: I1003 14:50:20.654295 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"c5ad343ebc4fbb79d9c25d28242be0ed044b9a7ef63c6a844189fb94cda2175a"} pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 14:50:20 crc kubenswrapper[4774]: I1003 14:50:20.654348 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" containerID="cri-o://c5ad343ebc4fbb79d9c25d28242be0ed044b9a7ef63c6a844189fb94cda2175a" gracePeriod=600 Oct 03 14:50:21 crc kubenswrapper[4774]: I1003 14:50:21.365678 4774 generic.go:334] "Generic (PLEG): container finished" podID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerID="c5ad343ebc4fbb79d9c25d28242be0ed044b9a7ef63c6a844189fb94cda2175a" exitCode=0 Oct 03 14:50:21 crc kubenswrapper[4774]: I1003 14:50:21.365762 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" event={"ID":"ca37ac4b-f421-4198-a179-12901d36f0f5","Type":"ContainerDied","Data":"c5ad343ebc4fbb79d9c25d28242be0ed044b9a7ef63c6a844189fb94cda2175a"} Oct 03 14:50:21 crc kubenswrapper[4774]: I1003 14:50:21.366579 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" event={"ID":"ca37ac4b-f421-4198-a179-12901d36f0f5","Type":"ContainerStarted","Data":"6084213ff4d2e0141161ed1ee2c6ddcac4069da8f5bf554f9b528002b00a69ce"} Oct 03 14:50:21 crc kubenswrapper[4774]: I1003 14:50:21.366628 4774 scope.go:117] "RemoveContainer" containerID="bac2decca2a2a2fda80b2eb3cf96d985ca649fc4317858f0c2cb356d57d5c055" Oct 03 14:50:45 crc kubenswrapper[4774]: I1003 14:50:45.621359 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" 
podUID="7d110133-f0c7-4b4c-a060-5a0fdb950e9f" containerName="registry" containerID="cri-o://f1f700616a1ad94d60892ac87f217e84ce5a9812c3fcc3832fd740a1c6f1e7f2" gracePeriod=30 Oct 03 14:50:45 crc kubenswrapper[4774]: I1003 14:50:45.974270 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:50:45 crc kubenswrapper[4774]: I1003 14:50:45.985258 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-registry-tls\") pod \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " Oct 03 14:50:45 crc kubenswrapper[4774]: I1003 14:50:45.985310 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlqr9\" (UniqueName: \"kubernetes.io/projected/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-kube-api-access-xlqr9\") pod \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " Oct 03 14:50:45 crc kubenswrapper[4774]: I1003 14:50:45.985349 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-ca-trust-extracted\") pod \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " Oct 03 14:50:45 crc kubenswrapper[4774]: I1003 14:50:45.985364 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-trusted-ca\") pod \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " Oct 03 14:50:45 crc kubenswrapper[4774]: I1003 14:50:45.985556 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " Oct 03 14:50:45 crc kubenswrapper[4774]: I1003 14:50:45.985596 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-bound-sa-token\") pod \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " Oct 03 14:50:45 crc kubenswrapper[4774]: I1003 14:50:45.985620 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-installation-pull-secrets\") pod \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " Oct 03 14:50:45 crc kubenswrapper[4774]: I1003 14:50:45.985680 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-registry-certificates\") pod \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\" (UID: \"7d110133-f0c7-4b4c-a060-5a0fdb950e9f\") " Oct 03 14:50:45 crc kubenswrapper[4774]: I1003 14:50:45.986362 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "7d110133-f0c7-4b4c-a060-5a0fdb950e9f" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:50:45 crc kubenswrapper[4774]: I1003 14:50:45.986400 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "7d110133-f0c7-4b4c-a060-5a0fdb950e9f" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:50:45 crc kubenswrapper[4774]: I1003 14:50:45.991879 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "7d110133-f0c7-4b4c-a060-5a0fdb950e9f" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:50:45 crc kubenswrapper[4774]: I1003 14:50:45.992302 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-kube-api-access-xlqr9" (OuterVolumeSpecName: "kube-api-access-xlqr9") pod "7d110133-f0c7-4b4c-a060-5a0fdb950e9f" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f"). InnerVolumeSpecName "kube-api-access-xlqr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:50:45 crc kubenswrapper[4774]: I1003 14:50:45.992435 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "7d110133-f0c7-4b4c-a060-5a0fdb950e9f" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:50:45 crc kubenswrapper[4774]: I1003 14:50:45.998204 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "7d110133-f0c7-4b4c-a060-5a0fdb950e9f" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:50:46 crc kubenswrapper[4774]: I1003 14:50:46.001049 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "7d110133-f0c7-4b4c-a060-5a0fdb950e9f" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 14:50:46 crc kubenswrapper[4774]: I1003 14:50:46.009200 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "7d110133-f0c7-4b4c-a060-5a0fdb950e9f" (UID: "7d110133-f0c7-4b4c-a060-5a0fdb950e9f"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:50:46 crc kubenswrapper[4774]: I1003 14:50:46.086585 4774 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 03 14:50:46 crc kubenswrapper[4774]: I1003 14:50:46.086631 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlqr9\" (UniqueName: \"kubernetes.io/projected/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-kube-api-access-xlqr9\") on node \"crc\" DevicePath \"\"" Oct 03 14:50:46 crc kubenswrapper[4774]: I1003 14:50:46.086645 4774 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 03 14:50:46 crc kubenswrapper[4774]: I1003 14:50:46.086656 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 14:50:46 crc kubenswrapper[4774]: I1003 14:50:46.086667 4774 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 03 14:50:46 crc kubenswrapper[4774]: I1003 14:50:46.086678 4774 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 14:50:46 crc kubenswrapper[4774]: I1003 14:50:46.086688 4774 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d110133-f0c7-4b4c-a060-5a0fdb950e9f-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 03 14:50:46 crc 
kubenswrapper[4774]: I1003 14:50:46.502698 4774 generic.go:334] "Generic (PLEG): container finished" podID="7d110133-f0c7-4b4c-a060-5a0fdb950e9f" containerID="f1f700616a1ad94d60892ac87f217e84ce5a9812c3fcc3832fd740a1c6f1e7f2" exitCode=0 Oct 03 14:50:46 crc kubenswrapper[4774]: I1003 14:50:46.502747 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" event={"ID":"7d110133-f0c7-4b4c-a060-5a0fdb950e9f","Type":"ContainerDied","Data":"f1f700616a1ad94d60892ac87f217e84ce5a9812c3fcc3832fd740a1c6f1e7f2"} Oct 03 14:50:46 crc kubenswrapper[4774]: I1003 14:50:46.502772 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" event={"ID":"7d110133-f0c7-4b4c-a060-5a0fdb950e9f","Type":"ContainerDied","Data":"ca2de3af8dd4f1cc3648cdebc623ca2347d2e7f6c54584aec12e2a45da3a95d3"} Oct 03 14:50:46 crc kubenswrapper[4774]: I1003 14:50:46.502788 4774 scope.go:117] "RemoveContainer" containerID="f1f700616a1ad94d60892ac87f217e84ce5a9812c3fcc3832fd740a1c6f1e7f2" Oct 03 14:50:46 crc kubenswrapper[4774]: I1003 14:50:46.502842 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4sjxw" Oct 03 14:50:46 crc kubenswrapper[4774]: I1003 14:50:46.519357 4774 scope.go:117] "RemoveContainer" containerID="f1f700616a1ad94d60892ac87f217e84ce5a9812c3fcc3832fd740a1c6f1e7f2" Oct 03 14:50:46 crc kubenswrapper[4774]: E1003 14:50:46.520101 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1f700616a1ad94d60892ac87f217e84ce5a9812c3fcc3832fd740a1c6f1e7f2\": container with ID starting with f1f700616a1ad94d60892ac87f217e84ce5a9812c3fcc3832fd740a1c6f1e7f2 not found: ID does not exist" containerID="f1f700616a1ad94d60892ac87f217e84ce5a9812c3fcc3832fd740a1c6f1e7f2" Oct 03 14:50:46 crc kubenswrapper[4774]: I1003 14:50:46.520177 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1f700616a1ad94d60892ac87f217e84ce5a9812c3fcc3832fd740a1c6f1e7f2"} err="failed to get container status \"f1f700616a1ad94d60892ac87f217e84ce5a9812c3fcc3832fd740a1c6f1e7f2\": rpc error: code = NotFound desc = could not find container \"f1f700616a1ad94d60892ac87f217e84ce5a9812c3fcc3832fd740a1c6f1e7f2\": container with ID starting with f1f700616a1ad94d60892ac87f217e84ce5a9812c3fcc3832fd740a1c6f1e7f2 not found: ID does not exist" Oct 03 14:50:46 crc kubenswrapper[4774]: I1003 14:50:46.544589 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4sjxw"] Oct 03 14:50:46 crc kubenswrapper[4774]: I1003 14:50:46.547505 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4sjxw"] Oct 03 14:50:47 crc kubenswrapper[4774]: I1003 14:50:47.307018 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d110133-f0c7-4b4c-a060-5a0fdb950e9f" path="/var/lib/kubelet/pods/7d110133-f0c7-4b4c-a060-5a0fdb950e9f/volumes" Oct 03 14:52:20 crc kubenswrapper[4774]: I1003 
14:52:20.653832 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:52:20 crc kubenswrapper[4774]: I1003 14:52:20.654593 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:52:50 crc kubenswrapper[4774]: I1003 14:52:50.653935 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:52:50 crc kubenswrapper[4774]: I1003 14:52:50.654650 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:53:20 crc kubenswrapper[4774]: I1003 14:53:20.654142 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:53:20 crc kubenswrapper[4774]: I1003 14:53:20.654981 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" 
podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:53:20 crc kubenswrapper[4774]: I1003 14:53:20.655050 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" Oct 03 14:53:20 crc kubenswrapper[4774]: I1003 14:53:20.655929 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6084213ff4d2e0141161ed1ee2c6ddcac4069da8f5bf554f9b528002b00a69ce"} pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 14:53:20 crc kubenswrapper[4774]: I1003 14:53:20.656033 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" containerID="cri-o://6084213ff4d2e0141161ed1ee2c6ddcac4069da8f5bf554f9b528002b00a69ce" gracePeriod=600 Oct 03 14:53:21 crc kubenswrapper[4774]: I1003 14:53:21.410630 4774 generic.go:334] "Generic (PLEG): container finished" podID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerID="6084213ff4d2e0141161ed1ee2c6ddcac4069da8f5bf554f9b528002b00a69ce" exitCode=0 Oct 03 14:53:21 crc kubenswrapper[4774]: I1003 14:53:21.410691 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" event={"ID":"ca37ac4b-f421-4198-a179-12901d36f0f5","Type":"ContainerDied","Data":"6084213ff4d2e0141161ed1ee2c6ddcac4069da8f5bf554f9b528002b00a69ce"} Oct 03 14:53:21 crc kubenswrapper[4774]: I1003 14:53:21.410989 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" event={"ID":"ca37ac4b-f421-4198-a179-12901d36f0f5","Type":"ContainerStarted","Data":"dd800268763caf1ba49e9f09998c3c8de0daa9481a442c7e1127db9996ab98ab"} Oct 03 14:53:21 crc kubenswrapper[4774]: I1003 14:53:21.411011 4774 scope.go:117] "RemoveContainer" containerID="c5ad343ebc4fbb79d9c25d28242be0ed044b9a7ef63c6a844189fb94cda2175a" Oct 03 14:53:38 crc kubenswrapper[4774]: I1003 14:53:38.570613 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-gghpw"] Oct 03 14:53:38 crc kubenswrapper[4774]: E1003 14:53:38.571357 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d110133-f0c7-4b4c-a060-5a0fdb950e9f" containerName="registry" Oct 03 14:53:38 crc kubenswrapper[4774]: I1003 14:53:38.571385 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d110133-f0c7-4b4c-a060-5a0fdb950e9f" containerName="registry" Oct 03 14:53:38 crc kubenswrapper[4774]: I1003 14:53:38.571485 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d110133-f0c7-4b4c-a060-5a0fdb950e9f" containerName="registry" Oct 03 14:53:38 crc kubenswrapper[4774]: I1003 14:53:38.571867 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-gghpw" Oct 03 14:53:38 crc kubenswrapper[4774]: I1003 14:53:38.577603 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-h2fts"] Oct 03 14:53:38 crc kubenswrapper[4774]: I1003 14:53:38.578427 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-h2fts" Oct 03 14:53:38 crc kubenswrapper[4774]: I1003 14:53:38.579387 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 03 14:53:38 crc kubenswrapper[4774]: I1003 14:53:38.579714 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 03 14:53:38 crc kubenswrapper[4774]: I1003 14:53:38.582662 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-hggn6"] Oct 03 14:53:38 crc kubenswrapper[4774]: I1003 14:53:38.583943 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-hggn6" Oct 03 14:53:38 crc kubenswrapper[4774]: I1003 14:53:38.584672 4774 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-g4s9k" Oct 03 14:53:38 crc kubenswrapper[4774]: I1003 14:53:38.584911 4774 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-9n2zr" Oct 03 14:53:38 crc kubenswrapper[4774]: I1003 14:53:38.588831 4774 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-gfvn5" Oct 03 14:53:38 crc kubenswrapper[4774]: I1003 14:53:38.590649 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-gghpw"] Oct 03 14:53:38 crc kubenswrapper[4774]: I1003 14:53:38.598850 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-hggn6"] Oct 03 14:53:38 crc kubenswrapper[4774]: I1003 14:53:38.603499 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-h2fts"] Oct 03 14:53:38 crc kubenswrapper[4774]: I1003 14:53:38.752292 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-l5q26\" (UniqueName: \"kubernetes.io/projected/1f99341f-994f-496c-9287-f0fa80429b74-kube-api-access-l5q26\") pod \"cert-manager-cainjector-7f985d654d-gghpw\" (UID: \"1f99341f-994f-496c-9287-f0fa80429b74\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-gghpw" Oct 03 14:53:38 crc kubenswrapper[4774]: I1003 14:53:38.752935 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t7cv\" (UniqueName: \"kubernetes.io/projected/769d7391-5628-4bcd-af8d-accf8b37c400-kube-api-access-9t7cv\") pod \"cert-manager-5b446d88c5-h2fts\" (UID: \"769d7391-5628-4bcd-af8d-accf8b37c400\") " pod="cert-manager/cert-manager-5b446d88c5-h2fts" Oct 03 14:53:38 crc kubenswrapper[4774]: I1003 14:53:38.753055 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvbpx\" (UniqueName: \"kubernetes.io/projected/3d3ba37a-f1af-431c-a733-19283fc5c055-kube-api-access-pvbpx\") pod \"cert-manager-webhook-5655c58dd6-hggn6\" (UID: \"3d3ba37a-f1af-431c-a733-19283fc5c055\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-hggn6" Oct 03 14:53:38 crc kubenswrapper[4774]: I1003 14:53:38.854079 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvbpx\" (UniqueName: \"kubernetes.io/projected/3d3ba37a-f1af-431c-a733-19283fc5c055-kube-api-access-pvbpx\") pod \"cert-manager-webhook-5655c58dd6-hggn6\" (UID: \"3d3ba37a-f1af-431c-a733-19283fc5c055\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-hggn6" Oct 03 14:53:38 crc kubenswrapper[4774]: I1003 14:53:38.854139 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5q26\" (UniqueName: \"kubernetes.io/projected/1f99341f-994f-496c-9287-f0fa80429b74-kube-api-access-l5q26\") pod \"cert-manager-cainjector-7f985d654d-gghpw\" (UID: \"1f99341f-994f-496c-9287-f0fa80429b74\") " 
pod="cert-manager/cert-manager-cainjector-7f985d654d-gghpw" Oct 03 14:53:38 crc kubenswrapper[4774]: I1003 14:53:38.854192 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t7cv\" (UniqueName: \"kubernetes.io/projected/769d7391-5628-4bcd-af8d-accf8b37c400-kube-api-access-9t7cv\") pod \"cert-manager-5b446d88c5-h2fts\" (UID: \"769d7391-5628-4bcd-af8d-accf8b37c400\") " pod="cert-manager/cert-manager-5b446d88c5-h2fts" Oct 03 14:53:38 crc kubenswrapper[4774]: I1003 14:53:38.876737 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5q26\" (UniqueName: \"kubernetes.io/projected/1f99341f-994f-496c-9287-f0fa80429b74-kube-api-access-l5q26\") pod \"cert-manager-cainjector-7f985d654d-gghpw\" (UID: \"1f99341f-994f-496c-9287-f0fa80429b74\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-gghpw" Oct 03 14:53:38 crc kubenswrapper[4774]: I1003 14:53:38.877090 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t7cv\" (UniqueName: \"kubernetes.io/projected/769d7391-5628-4bcd-af8d-accf8b37c400-kube-api-access-9t7cv\") pod \"cert-manager-5b446d88c5-h2fts\" (UID: \"769d7391-5628-4bcd-af8d-accf8b37c400\") " pod="cert-manager/cert-manager-5b446d88c5-h2fts" Oct 03 14:53:38 crc kubenswrapper[4774]: I1003 14:53:38.877574 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvbpx\" (UniqueName: \"kubernetes.io/projected/3d3ba37a-f1af-431c-a733-19283fc5c055-kube-api-access-pvbpx\") pod \"cert-manager-webhook-5655c58dd6-hggn6\" (UID: \"3d3ba37a-f1af-431c-a733-19283fc5c055\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-hggn6" Oct 03 14:53:38 crc kubenswrapper[4774]: I1003 14:53:38.893203 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-gghpw" Oct 03 14:53:38 crc kubenswrapper[4774]: I1003 14:53:38.905504 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-h2fts" Oct 03 14:53:38 crc kubenswrapper[4774]: I1003 14:53:38.913940 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-hggn6" Oct 03 14:53:39 crc kubenswrapper[4774]: I1003 14:53:39.115617 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-gghpw"] Oct 03 14:53:39 crc kubenswrapper[4774]: I1003 14:53:39.126834 4774 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 14:53:39 crc kubenswrapper[4774]: I1003 14:53:39.356026 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-h2fts"] Oct 03 14:53:39 crc kubenswrapper[4774]: W1003 14:53:39.360697 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod769d7391_5628_4bcd_af8d_accf8b37c400.slice/crio-1603d23625adf6f7c02387bd83c8e0b36063d900950ccc2be7eb67c3233cc403 WatchSource:0}: Error finding container 1603d23625adf6f7c02387bd83c8e0b36063d900950ccc2be7eb67c3233cc403: Status 404 returned error can't find the container with id 1603d23625adf6f7c02387bd83c8e0b36063d900950ccc2be7eb67c3233cc403 Oct 03 14:53:39 crc kubenswrapper[4774]: I1003 14:53:39.405188 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-hggn6"] Oct 03 14:53:39 crc kubenswrapper[4774]: W1003 14:53:39.412162 4774 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d3ba37a_f1af_431c_a733_19283fc5c055.slice/crio-3dc099b3f4a7e4b238041e0a26f8456598cc962230125a10fa6646284f895cc7 WatchSource:0}: Error finding container 3dc099b3f4a7e4b238041e0a26f8456598cc962230125a10fa6646284f895cc7: Status 404 returned error can't find the container with id 3dc099b3f4a7e4b238041e0a26f8456598cc962230125a10fa6646284f895cc7 Oct 03 14:53:39 crc kubenswrapper[4774]: I1003 14:53:39.520987 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-h2fts" event={"ID":"769d7391-5628-4bcd-af8d-accf8b37c400","Type":"ContainerStarted","Data":"1603d23625adf6f7c02387bd83c8e0b36063d900950ccc2be7eb67c3233cc403"} Oct 03 14:53:39 crc kubenswrapper[4774]: I1003 14:53:39.522061 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-gghpw" event={"ID":"1f99341f-994f-496c-9287-f0fa80429b74","Type":"ContainerStarted","Data":"547cdbeeb8c915b608eb86007a5c17915044d3487ce2d2d0b5bd99b0df81d754"} Oct 03 14:53:39 crc kubenswrapper[4774]: I1003 14:53:39.523444 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-hggn6" event={"ID":"3d3ba37a-f1af-431c-a733-19283fc5c055","Type":"ContainerStarted","Data":"3dc099b3f4a7e4b238041e0a26f8456598cc962230125a10fa6646284f895cc7"} Oct 03 14:53:44 crc kubenswrapper[4774]: I1003 14:53:44.555428 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-hggn6" event={"ID":"3d3ba37a-f1af-431c-a733-19283fc5c055","Type":"ContainerStarted","Data":"3c24388feb1133f087a28afab2ee0c1fa6f11fbaece8c7954d638f666edae017"} Oct 03 14:53:44 crc kubenswrapper[4774]: I1003 14:53:44.555877 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-hggn6" Oct 03 14:53:44 crc kubenswrapper[4774]: I1003 14:53:44.556893 4774 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-h2fts" event={"ID":"769d7391-5628-4bcd-af8d-accf8b37c400","Type":"ContainerStarted","Data":"e98e9fd7805125a1d08528e3784d9d228580463d8d5dc6df7232d178e5d15f7f"} Oct 03 14:53:44 crc kubenswrapper[4774]: I1003 14:53:44.558771 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-gghpw" event={"ID":"1f99341f-994f-496c-9287-f0fa80429b74","Type":"ContainerStarted","Data":"8773375983ff1a9d26c1cc308e4a49da29cc2202d985b0c12aa461ea35f79182"} Oct 03 14:53:44 crc kubenswrapper[4774]: I1003 14:53:44.573517 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-hggn6" podStartSLOduration=2.246584428 podStartE2EDuration="6.573505042s" podCreationTimestamp="2025-10-03 14:53:38 +0000 UTC" firstStartedPulling="2025-10-03 14:53:39.415936102 +0000 UTC m=+642.005139554" lastFinishedPulling="2025-10-03 14:53:43.742856706 +0000 UTC m=+646.332060168" observedRunningTime="2025-10-03 14:53:44.569977994 +0000 UTC m=+647.159181486" watchObservedRunningTime="2025-10-03 14:53:44.573505042 +0000 UTC m=+647.162708494" Oct 03 14:53:44 crc kubenswrapper[4774]: I1003 14:53:44.592776 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-h2fts" podStartSLOduration=2.211027002 podStartE2EDuration="6.592761962s" podCreationTimestamp="2025-10-03 14:53:38 +0000 UTC" firstStartedPulling="2025-10-03 14:53:39.362460049 +0000 UTC m=+641.951663501" lastFinishedPulling="2025-10-03 14:53:43.744195009 +0000 UTC m=+646.333398461" observedRunningTime="2025-10-03 14:53:44.589639944 +0000 UTC m=+647.178843396" watchObservedRunningTime="2025-10-03 14:53:44.592761962 +0000 UTC m=+647.181965414" Oct 03 14:53:44 crc kubenswrapper[4774]: I1003 14:53:44.605680 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="cert-manager/cert-manager-cainjector-7f985d654d-gghpw" podStartSLOduration=1.927726673 podStartE2EDuration="6.605658704s" podCreationTimestamp="2025-10-03 14:53:38 +0000 UTC" firstStartedPulling="2025-10-03 14:53:39.126589013 +0000 UTC m=+641.715792465" lastFinishedPulling="2025-10-03 14:53:43.804521044 +0000 UTC m=+646.393724496" observedRunningTime="2025-10-03 14:53:44.603455589 +0000 UTC m=+647.192659041" watchObservedRunningTime="2025-10-03 14:53:44.605658704 +0000 UTC m=+647.194862156" Oct 03 14:53:48 crc kubenswrapper[4774]: I1003 14:53:48.918185 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-hggn6" Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.235490 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jzv75"] Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.235918 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="ovn-controller" containerID="cri-o://bc0d624981ca616d27057de09511f8a17b643686871e2e77c15bd1c65fba9c95" gracePeriod=30 Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.236048 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="northd" containerID="cri-o://8ae8d985dec947c332efa41abbc5c30a978f06942cc62a86ab45e90fc2e5d763" gracePeriod=30 Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.236097 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://b330d308f14c0b3b3630e89fe3e43728e17dbc3fe78cf9cae365a3ab0474c600" gracePeriod=30 Oct 03 14:53:49 crc 
kubenswrapper[4774]: I1003 14:53:49.236135 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="kube-rbac-proxy-node" containerID="cri-o://fdd06fa447f9a388f7cabafaff05c8d6dc9aedc161d2097f81855318c8b7712e" gracePeriod=30 Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.236172 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="ovn-acl-logging" containerID="cri-o://aaa74c70ca8d93d672246f3451771bffa1e304c88a453cbbe3a305a304954fe7" gracePeriod=30 Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.236182 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="sbdb" containerID="cri-o://ac1b1bb4b30aa1d11558ff126ad87e4dd7b9aaf255cd4a1860aacdee30a0bb15" gracePeriod=30 Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.235925 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="nbdb" containerID="cri-o://b575a360d2ee43bc2629635a4ffda77a1b0e1a171bbd99b76b725784be537e9c" gracePeriod=30 Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.283876 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="ovnkube-controller" containerID="cri-o://3d421c202f552f64422b854f56305a823b6051acede4800cdb123010bdd2af47" gracePeriod=30 Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.586680 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-jk5hb_4f2cc8dc-61c3-4a0b-8da3-b899094eaa53/kube-multus/2.log" Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.587166 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jk5hb_4f2cc8dc-61c3-4a0b-8da3-b899094eaa53/kube-multus/1.log" Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.587212 4774 generic.go:334] "Generic (PLEG): container finished" podID="4f2cc8dc-61c3-4a0b-8da3-b899094eaa53" containerID="ac62cac6bdedfb0b9adab5e07f0bd50c6fbda746776d2da997f7537cd0c44d2a" exitCode=2 Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.587268 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jk5hb" event={"ID":"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53","Type":"ContainerDied","Data":"ac62cac6bdedfb0b9adab5e07f0bd50c6fbda746776d2da997f7537cd0c44d2a"} Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.587308 4774 scope.go:117] "RemoveContainer" containerID="a17e01ed9b7f955272f2c5bb14a9624d7faeb0a4727698c093a2acc4e2da71af" Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.587820 4774 scope.go:117] "RemoveContainer" containerID="ac62cac6bdedfb0b9adab5e07f0bd50c6fbda746776d2da997f7537cd0c44d2a" Oct 03 14:53:49 crc kubenswrapper[4774]: E1003 14:53:49.588164 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-jk5hb_openshift-multus(4f2cc8dc-61c3-4a0b-8da3-b899094eaa53)\"" pod="openshift-multus/multus-jk5hb" podUID="4f2cc8dc-61c3-4a0b-8da3-b899094eaa53" Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.592969 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzv75_01bef0c3-23b3-4f49-8c33-3f2ec7503b12/ovnkube-controller/3.log" Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.596902 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzv75_01bef0c3-23b3-4f49-8c33-3f2ec7503b12/ovn-acl-logging/0.log" Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.597427 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzv75_01bef0c3-23b3-4f49-8c33-3f2ec7503b12/ovn-controller/0.log" Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.597948 4774 generic.go:334] "Generic (PLEG): container finished" podID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerID="3d421c202f552f64422b854f56305a823b6051acede4800cdb123010bdd2af47" exitCode=0 Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.597969 4774 generic.go:334] "Generic (PLEG): container finished" podID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerID="ac1b1bb4b30aa1d11558ff126ad87e4dd7b9aaf255cd4a1860aacdee30a0bb15" exitCode=0 Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.597979 4774 generic.go:334] "Generic (PLEG): container finished" podID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerID="b575a360d2ee43bc2629635a4ffda77a1b0e1a171bbd99b76b725784be537e9c" exitCode=0 Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.597987 4774 generic.go:334] "Generic (PLEG): container finished" podID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerID="8ae8d985dec947c332efa41abbc5c30a978f06942cc62a86ab45e90fc2e5d763" exitCode=0 Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.597996 4774 generic.go:334] "Generic (PLEG): container finished" podID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerID="b330d308f14c0b3b3630e89fe3e43728e17dbc3fe78cf9cae365a3ab0474c600" exitCode=0 Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.598005 4774 generic.go:334] "Generic (PLEG): container finished" podID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerID="fdd06fa447f9a388f7cabafaff05c8d6dc9aedc161d2097f81855318c8b7712e" exitCode=0 Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.598013 4774 generic.go:334] "Generic (PLEG): 
container finished" podID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerID="aaa74c70ca8d93d672246f3451771bffa1e304c88a453cbbe3a305a304954fe7" exitCode=143 Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.598021 4774 generic.go:334] "Generic (PLEG): container finished" podID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerID="bc0d624981ca616d27057de09511f8a17b643686871e2e77c15bd1c65fba9c95" exitCode=143 Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.598026 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" event={"ID":"01bef0c3-23b3-4f49-8c33-3f2ec7503b12","Type":"ContainerDied","Data":"3d421c202f552f64422b854f56305a823b6051acede4800cdb123010bdd2af47"} Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.598083 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" event={"ID":"01bef0c3-23b3-4f49-8c33-3f2ec7503b12","Type":"ContainerDied","Data":"ac1b1bb4b30aa1d11558ff126ad87e4dd7b9aaf255cd4a1860aacdee30a0bb15"} Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.598098 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" event={"ID":"01bef0c3-23b3-4f49-8c33-3f2ec7503b12","Type":"ContainerDied","Data":"b575a360d2ee43bc2629635a4ffda77a1b0e1a171bbd99b76b725784be537e9c"} Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.598128 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" event={"ID":"01bef0c3-23b3-4f49-8c33-3f2ec7503b12","Type":"ContainerDied","Data":"8ae8d985dec947c332efa41abbc5c30a978f06942cc62a86ab45e90fc2e5d763"} Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.598143 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" event={"ID":"01bef0c3-23b3-4f49-8c33-3f2ec7503b12","Type":"ContainerDied","Data":"b330d308f14c0b3b3630e89fe3e43728e17dbc3fe78cf9cae365a3ab0474c600"} 
Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.598153 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" event={"ID":"01bef0c3-23b3-4f49-8c33-3f2ec7503b12","Type":"ContainerDied","Data":"fdd06fa447f9a388f7cabafaff05c8d6dc9aedc161d2097f81855318c8b7712e"} Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.598163 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" event={"ID":"01bef0c3-23b3-4f49-8c33-3f2ec7503b12","Type":"ContainerDied","Data":"aaa74c70ca8d93d672246f3451771bffa1e304c88a453cbbe3a305a304954fe7"} Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.598193 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" event={"ID":"01bef0c3-23b3-4f49-8c33-3f2ec7503b12","Type":"ContainerDied","Data":"bc0d624981ca616d27057de09511f8a17b643686871e2e77c15bd1c65fba9c95"} Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.633051 4774 scope.go:117] "RemoveContainer" containerID="d5d7482646eb7f611642d40cc2246f4325e1fdc7d5b08eab0ac5777911a9b209" Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.983571 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzv75_01bef0c3-23b3-4f49-8c33-3f2ec7503b12/ovn-acl-logging/0.log" Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.984104 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzv75_01bef0c3-23b3-4f49-8c33-3f2ec7503b12/ovn-controller/0.log" Oct 03 14:53:49 crc kubenswrapper[4774]: I1003 14:53:49.984740 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.062684 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wndvx"] Oct 03 14:53:50 crc kubenswrapper[4774]: E1003 14:53:50.063017 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="northd" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.063047 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="northd" Oct 03 14:53:50 crc kubenswrapper[4774]: E1003 14:53:50.063062 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="ovnkube-controller" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.063076 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="ovnkube-controller" Oct 03 14:53:50 crc kubenswrapper[4774]: E1003 14:53:50.063094 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="sbdb" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.063111 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="sbdb" Oct 03 14:53:50 crc kubenswrapper[4774]: E1003 14:53:50.063132 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="kube-rbac-proxy-node" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.063145 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="kube-rbac-proxy-node" Oct 03 14:53:50 crc kubenswrapper[4774]: E1003 14:53:50.063164 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" 
containerName="ovn-controller" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.063176 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="ovn-controller" Oct 03 14:53:50 crc kubenswrapper[4774]: E1003 14:53:50.063194 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="kube-rbac-proxy-ovn-metrics" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.063207 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="kube-rbac-proxy-ovn-metrics" Oct 03 14:53:50 crc kubenswrapper[4774]: E1003 14:53:50.063224 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="ovnkube-controller" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.063237 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="ovnkube-controller" Oct 03 14:53:50 crc kubenswrapper[4774]: E1003 14:53:50.063254 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="ovn-acl-logging" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.063269 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="ovn-acl-logging" Oct 03 14:53:50 crc kubenswrapper[4774]: E1003 14:53:50.063284 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="kubecfg-setup" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.063297 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="kubecfg-setup" Oct 03 14:53:50 crc kubenswrapper[4774]: E1003 14:53:50.063317 4774 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="nbdb" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.063330 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="nbdb" Oct 03 14:53:50 crc kubenswrapper[4774]: E1003 14:53:50.063350 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="ovnkube-controller" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.063363 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="ovnkube-controller" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.063559 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="sbdb" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.063576 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="ovnkube-controller" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.063596 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="ovn-controller" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.063618 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="ovnkube-controller" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.063637 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="nbdb" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.063656 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="kube-rbac-proxy-node" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.063670 4774 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="ovnkube-controller" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.063684 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="ovn-acl-logging" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.063702 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="kube-rbac-proxy-ovn-metrics" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.063719 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="northd" Oct 03 14:53:50 crc kubenswrapper[4774]: E1003 14:53:50.063877 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="ovnkube-controller" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.063893 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="ovnkube-controller" Oct 03 14:53:50 crc kubenswrapper[4774]: E1003 14:53:50.063917 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="ovnkube-controller" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.063931 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="ovnkube-controller" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.064105 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="ovnkube-controller" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.064131 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" containerName="ovnkube-controller" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.067157 4774 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.093759 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-run-ovn\") pod \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.094097 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "01bef0c3-23b3-4f49-8c33-3f2ec7503b12" (UID: "01bef0c3-23b3-4f49-8c33-3f2ec7503b12"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.094466 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-log-socket\") pod \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.094549 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-log-socket" (OuterVolumeSpecName: "log-socket") pod "01bef0c3-23b3-4f49-8c33-3f2ec7503b12" (UID: "01bef0c3-23b3-4f49-8c33-3f2ec7503b12"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.094738 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-ovnkube-config\") pod \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.094808 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-cni-netd\") pod \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.094853 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-ovnkube-script-lib\") pod \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.094903 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-etc-openvswitch\") pod \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.094963 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-run-systemd\") pod \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.094983 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "01bef0c3-23b3-4f49-8c33-3f2ec7503b12" (UID: "01bef0c3-23b3-4f49-8c33-3f2ec7503b12"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.095014 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-var-lib-cni-networks-ovn-kubernetes\") pod \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.095069 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-run-netns\") pod \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.095107 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-cni-bin\") pod \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.095150 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gp65q\" (UniqueName: \"kubernetes.io/projected/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-kube-api-access-gp65q\") pod \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.095188 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-run-ovn-kubernetes\") pod \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.095235 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-ovn-node-metrics-cert\") pod \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.095278 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-env-overrides\") pod \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.095323 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-run-openvswitch\") pod \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.095360 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-var-lib-openvswitch\") pod \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.095442 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-systemd-units\") pod \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " Oct 03 14:53:50 crc 
kubenswrapper[4774]: I1003 14:53:50.095486 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-kubelet\") pod \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.095523 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-slash\") pod \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.095585 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-node-log\") pod \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\" (UID: \"01bef0c3-23b3-4f49-8c33-3f2ec7503b12\") " Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.095782 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "01bef0c3-23b3-4f49-8c33-3f2ec7503b12" (UID: "01bef0c3-23b3-4f49-8c33-3f2ec7503b12"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.095793 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-var-lib-openvswitch\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.095874 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/21752396-49c5-467a-8a9b-3fea45b89794-ovn-node-metrics-cert\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.095932 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/21752396-49c5-467a-8a9b-3fea45b89794-ovnkube-script-lib\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.095992 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/21752396-49c5-467a-8a9b-3fea45b89794-ovnkube-config\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.096042 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-host-kubelet\") pod \"ovnkube-node-wndvx\" 
(UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.096089 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-run-openvswitch\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.096135 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-run-ovn\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.096187 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjqwp\" (UniqueName: \"kubernetes.io/projected/21752396-49c5-467a-8a9b-3fea45b89794-kube-api-access-wjqwp\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.096231 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-host-run-ovn-kubernetes\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.096294 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-etc-openvswitch\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.096342 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-run-systemd\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.096423 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-host-slash\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.096481 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-log-socket\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.096527 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-systemd-units\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.096581 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-host-cni-bin\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.096636 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-host-run-netns\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.096659 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "01bef0c3-23b3-4f49-8c33-3f2ec7503b12" (UID: "01bef0c3-23b3-4f49-8c33-3f2ec7503b12"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.096678 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-node-log\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.096731 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.096795 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/21752396-49c5-467a-8a9b-3fea45b89794-env-overrides\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.097090 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-host-cni-netd\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.097169 4774 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 14:53:50 crc kubenswrapper[4774]: 
I1003 14:53:50.097195 4774 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-log-socket\") on node \"crc\" DevicePath \"\"" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.097217 4774 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.097242 4774 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.097265 4774 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.096733 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "01bef0c3-23b3-4f49-8c33-3f2ec7503b12" (UID: "01bef0c3-23b3-4f49-8c33-3f2ec7503b12"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.098048 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "01bef0c3-23b3-4f49-8c33-3f2ec7503b12" (UID: "01bef0c3-23b3-4f49-8c33-3f2ec7503b12"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.098051 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "01bef0c3-23b3-4f49-8c33-3f2ec7503b12" (UID: "01bef0c3-23b3-4f49-8c33-3f2ec7503b12"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.098175 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "01bef0c3-23b3-4f49-8c33-3f2ec7503b12" (UID: "01bef0c3-23b3-4f49-8c33-3f2ec7503b12"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.098181 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "01bef0c3-23b3-4f49-8c33-3f2ec7503b12" (UID: "01bef0c3-23b3-4f49-8c33-3f2ec7503b12"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.098230 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-slash" (OuterVolumeSpecName: "host-slash") pod "01bef0c3-23b3-4f49-8c33-3f2ec7503b12" (UID: "01bef0c3-23b3-4f49-8c33-3f2ec7503b12"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.098191 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "01bef0c3-23b3-4f49-8c33-3f2ec7503b12" (UID: "01bef0c3-23b3-4f49-8c33-3f2ec7503b12"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.098238 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-node-log" (OuterVolumeSpecName: "node-log") pod "01bef0c3-23b3-4f49-8c33-3f2ec7503b12" (UID: "01bef0c3-23b3-4f49-8c33-3f2ec7503b12"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.098259 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "01bef0c3-23b3-4f49-8c33-3f2ec7503b12" (UID: "01bef0c3-23b3-4f49-8c33-3f2ec7503b12"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.098100 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "01bef0c3-23b3-4f49-8c33-3f2ec7503b12" (UID: "01bef0c3-23b3-4f49-8c33-3f2ec7503b12"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.098257 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "01bef0c3-23b3-4f49-8c33-3f2ec7503b12" (UID: "01bef0c3-23b3-4f49-8c33-3f2ec7503b12"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.098888 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "01bef0c3-23b3-4f49-8c33-3f2ec7503b12" (UID: "01bef0c3-23b3-4f49-8c33-3f2ec7503b12"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.106254 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-kube-api-access-gp65q" (OuterVolumeSpecName: "kube-api-access-gp65q") pod "01bef0c3-23b3-4f49-8c33-3f2ec7503b12" (UID: "01bef0c3-23b3-4f49-8c33-3f2ec7503b12"). InnerVolumeSpecName "kube-api-access-gp65q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.107034 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "01bef0c3-23b3-4f49-8c33-3f2ec7503b12" (UID: "01bef0c3-23b3-4f49-8c33-3f2ec7503b12"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.122890 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "01bef0c3-23b3-4f49-8c33-3f2ec7503b12" (UID: "01bef0c3-23b3-4f49-8c33-3f2ec7503b12"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.197715 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/21752396-49c5-467a-8a9b-3fea45b89794-ovnkube-config\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.197762 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-host-kubelet\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.197778 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-run-openvswitch\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.197793 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-run-ovn\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.197813 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-host-run-ovn-kubernetes\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.197827 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjqwp\" (UniqueName: \"kubernetes.io/projected/21752396-49c5-467a-8a9b-3fea45b89794-kube-api-access-wjqwp\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.197848 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-etc-openvswitch\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.197865 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-run-systemd\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.197879 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-host-slash\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 
14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.197895 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-log-socket\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.197911 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-systemd-units\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.197926 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-host-cni-bin\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.197943 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-host-run-netns\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.197959 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-node-log\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.197976 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.197994 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/21752396-49c5-467a-8a9b-3fea45b89794-env-overrides\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.198015 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-host-cni-netd\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.198038 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/21752396-49c5-467a-8a9b-3fea45b89794-ovn-node-metrics-cert\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.198055 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-var-lib-openvswitch\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.198067 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/21752396-49c5-467a-8a9b-3fea45b89794-ovnkube-script-lib\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.198104 4774 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.198114 4774 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.198123 4774 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.198134 4774 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.198142 4774 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.198151 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gp65q\" (UniqueName: \"kubernetes.io/projected/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-kube-api-access-gp65q\") on node \"crc\" DevicePath \"\"" Oct 03 14:53:50 crc 
kubenswrapper[4774]: I1003 14:53:50.198160 4774 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.198169 4774 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.198176 4774 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.198184 4774 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.198192 4774 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.198200 4774 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.198209 4774 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.198216 4774 reconciler_common.go:293] "Volume detached 
for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-host-slash\") on node \"crc\" DevicePath \"\"" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.198224 4774 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/01bef0c3-23b3-4f49-8c33-3f2ec7503b12-node-log\") on node \"crc\" DevicePath \"\"" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.198529 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-log-socket\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.198572 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-host-run-ovn-kubernetes\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.198602 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-etc-openvswitch\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.198612 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-host-cni-bin\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.198572 4774 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.198631 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-run-systemd\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.198654 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-host-run-netns\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.198718 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-host-slash\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.198745 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-run-openvswitch\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.198762 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-var-lib-openvswitch\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.198775 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-systemd-units\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.198789 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-node-log\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.198782 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-host-kubelet\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.198783 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-host-cni-netd\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.198822 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/21752396-49c5-467a-8a9b-3fea45b89794-run-ovn\") pod \"ovnkube-node-wndvx\" 
(UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.199974 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/21752396-49c5-467a-8a9b-3fea45b89794-env-overrides\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.200053 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/21752396-49c5-467a-8a9b-3fea45b89794-ovnkube-config\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.200250 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/21752396-49c5-467a-8a9b-3fea45b89794-ovnkube-script-lib\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.202703 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/21752396-49c5-467a-8a9b-3fea45b89794-ovn-node-metrics-cert\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.218444 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjqwp\" (UniqueName: \"kubernetes.io/projected/21752396-49c5-467a-8a9b-3fea45b89794-kube-api-access-wjqwp\") pod \"ovnkube-node-wndvx\" (UID: \"21752396-49c5-467a-8a9b-3fea45b89794\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.385605 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.604309 4774 generic.go:334] "Generic (PLEG): container finished" podID="21752396-49c5-467a-8a9b-3fea45b89794" containerID="a5761257a1cbb5deb6700ed58718f0f87a69889bfdaaaf169efe1ad4f05401a9" exitCode=0 Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.604437 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" event={"ID":"21752396-49c5-467a-8a9b-3fea45b89794","Type":"ContainerDied","Data":"a5761257a1cbb5deb6700ed58718f0f87a69889bfdaaaf169efe1ad4f05401a9"} Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.604469 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" event={"ID":"21752396-49c5-467a-8a9b-3fea45b89794","Type":"ContainerStarted","Data":"ecc2ffc36268b4a56e375e9224ac669250f9e0416b43d0a0ef355e157398ad63"} Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.606953 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jk5hb_4f2cc8dc-61c3-4a0b-8da3-b899094eaa53/kube-multus/2.log" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.618400 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzv75_01bef0c3-23b3-4f49-8c33-3f2ec7503b12/ovn-acl-logging/0.log" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.618889 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzv75_01bef0c3-23b3-4f49-8c33-3f2ec7503b12/ovn-controller/0.log" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.619288 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" 
event={"ID":"01bef0c3-23b3-4f49-8c33-3f2ec7503b12","Type":"ContainerDied","Data":"c515b30d898275d86790d1d37cb131444b0cb4a19d897d79dccb46b7ed13ae4b"} Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.619323 4774 scope.go:117] "RemoveContainer" containerID="3d421c202f552f64422b854f56305a823b6051acede4800cdb123010bdd2af47" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.619446 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jzv75" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.708719 4774 scope.go:117] "RemoveContainer" containerID="ac1b1bb4b30aa1d11558ff126ad87e4dd7b9aaf255cd4a1860aacdee30a0bb15" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.739829 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jzv75"] Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.742576 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jzv75"] Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.752088 4774 scope.go:117] "RemoveContainer" containerID="b575a360d2ee43bc2629635a4ffda77a1b0e1a171bbd99b76b725784be537e9c" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.765408 4774 scope.go:117] "RemoveContainer" containerID="8ae8d985dec947c332efa41abbc5c30a978f06942cc62a86ab45e90fc2e5d763" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.777515 4774 scope.go:117] "RemoveContainer" containerID="b330d308f14c0b3b3630e89fe3e43728e17dbc3fe78cf9cae365a3ab0474c600" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.790084 4774 scope.go:117] "RemoveContainer" containerID="fdd06fa447f9a388f7cabafaff05c8d6dc9aedc161d2097f81855318c8b7712e" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.801657 4774 scope.go:117] "RemoveContainer" containerID="aaa74c70ca8d93d672246f3451771bffa1e304c88a453cbbe3a305a304954fe7" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.812689 
4774 scope.go:117] "RemoveContainer" containerID="bc0d624981ca616d27057de09511f8a17b643686871e2e77c15bd1c65fba9c95" Oct 03 14:53:50 crc kubenswrapper[4774]: I1003 14:53:50.832872 4774 scope.go:117] "RemoveContainer" containerID="583beceb4b20a27948874d13b00bd03fc24c94abfc0bcccec417845822583a5b" Oct 03 14:53:51 crc kubenswrapper[4774]: I1003 14:53:51.312662 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01bef0c3-23b3-4f49-8c33-3f2ec7503b12" path="/var/lib/kubelet/pods/01bef0c3-23b3-4f49-8c33-3f2ec7503b12/volumes" Oct 03 14:53:51 crc kubenswrapper[4774]: I1003 14:53:51.632458 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" event={"ID":"21752396-49c5-467a-8a9b-3fea45b89794","Type":"ContainerStarted","Data":"f9d501985251c30c02a4a7a5629844d8c0bd1aa695a35172ce738d81c726bcb9"} Oct 03 14:53:51 crc kubenswrapper[4774]: I1003 14:53:51.632529 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" event={"ID":"21752396-49c5-467a-8a9b-3fea45b89794","Type":"ContainerStarted","Data":"d9fed01d5a348e3d072d9c41b8e89c3cf05152e7f15dd2d1966a17f99f401892"} Oct 03 14:53:51 crc kubenswrapper[4774]: I1003 14:53:51.632558 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" event={"ID":"21752396-49c5-467a-8a9b-3fea45b89794","Type":"ContainerStarted","Data":"96819d9d38a636fa4ae1d747dbeeb0202aeaacb95df61b3ecde0db024e603c6a"} Oct 03 14:53:51 crc kubenswrapper[4774]: I1003 14:53:51.632586 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" event={"ID":"21752396-49c5-467a-8a9b-3fea45b89794","Type":"ContainerStarted","Data":"51f22ef4bb5f92007d61cb80531a1824e76804a19b4cab09a831278bb2b361cb"} Oct 03 14:53:51 crc kubenswrapper[4774]: I1003 14:53:51.632606 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" 
event={"ID":"21752396-49c5-467a-8a9b-3fea45b89794","Type":"ContainerStarted","Data":"396a8a87b54ee7b0b0d7fb03819a2418b00f46f7e621a7152b1b56ef7f72e0ca"} Oct 03 14:53:51 crc kubenswrapper[4774]: I1003 14:53:51.632626 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" event={"ID":"21752396-49c5-467a-8a9b-3fea45b89794","Type":"ContainerStarted","Data":"32680af05c5946fa5bf641084d288017d7e7ccc213e39560cb8b22568034a602"} Oct 03 14:53:54 crc kubenswrapper[4774]: I1003 14:53:54.665031 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" event={"ID":"21752396-49c5-467a-8a9b-3fea45b89794","Type":"ContainerStarted","Data":"7b569bdf23b26646bd308b377303e47c8fa57bcc44ed9e6f0df328622decb7bc"} Oct 03 14:53:56 crc kubenswrapper[4774]: I1003 14:53:56.678547 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" event={"ID":"21752396-49c5-467a-8a9b-3fea45b89794","Type":"ContainerStarted","Data":"f4615a3d926beb3e9e10f569fa24c9d707771ebd4f21acd1998d260f309aed54"} Oct 03 14:53:56 crc kubenswrapper[4774]: I1003 14:53:56.679167 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:56 crc kubenswrapper[4774]: I1003 14:53:56.679281 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:56 crc kubenswrapper[4774]: I1003 14:53:56.679296 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:56 crc kubenswrapper[4774]: I1003 14:53:56.709211 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" podStartSLOduration=6.709186054 podStartE2EDuration="6.709186054s" podCreationTimestamp="2025-10-03 14:53:50 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:53:56.70619177 +0000 UTC m=+659.295395242" watchObservedRunningTime="2025-10-03 14:53:56.709186054 +0000 UTC m=+659.298389546" Oct 03 14:53:56 crc kubenswrapper[4774]: I1003 14:53:56.714869 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:53:56 crc kubenswrapper[4774]: I1003 14:53:56.730060 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:54:04 crc kubenswrapper[4774]: I1003 14:54:04.299609 4774 scope.go:117] "RemoveContainer" containerID="ac62cac6bdedfb0b9adab5e07f0bd50c6fbda746776d2da997f7537cd0c44d2a" Oct 03 14:54:04 crc kubenswrapper[4774]: E1003 14:54:04.300319 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-jk5hb_openshift-multus(4f2cc8dc-61c3-4a0b-8da3-b899094eaa53)\"" pod="openshift-multus/multus-jk5hb" podUID="4f2cc8dc-61c3-4a0b-8da3-b899094eaa53" Oct 03 14:54:15 crc kubenswrapper[4774]: I1003 14:54:15.299675 4774 scope.go:117] "RemoveContainer" containerID="ac62cac6bdedfb0b9adab5e07f0bd50c6fbda746776d2da997f7537cd0c44d2a" Oct 03 14:54:15 crc kubenswrapper[4774]: I1003 14:54:15.799160 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jk5hb_4f2cc8dc-61c3-4a0b-8da3-b899094eaa53/kube-multus/2.log" Oct 03 14:54:15 crc kubenswrapper[4774]: I1003 14:54:15.799420 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jk5hb" event={"ID":"4f2cc8dc-61c3-4a0b-8da3-b899094eaa53","Type":"ContainerStarted","Data":"af9e86466ad017d98e1087b56d393e0ff64d524edb4aaaae58aa03dfde5d5e2c"} Oct 03 14:54:20 crc kubenswrapper[4774]: I1003 14:54:20.418472 4774 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wndvx" Oct 03 14:54:27 crc kubenswrapper[4774]: I1003 14:54:27.612913 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm"] Oct 03 14:54:27 crc kubenswrapper[4774]: I1003 14:54:27.614394 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm" Oct 03 14:54:27 crc kubenswrapper[4774]: I1003 14:54:27.616529 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 03 14:54:27 crc kubenswrapper[4774]: I1003 14:54:27.625093 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm"] Oct 03 14:54:27 crc kubenswrapper[4774]: I1003 14:54:27.715816 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7845d8dc-4399-4450-b1d6-d424a8d64539-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm\" (UID: \"7845d8dc-4399-4450-b1d6-d424a8d64539\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm" Oct 03 14:54:27 crc kubenswrapper[4774]: I1003 14:54:27.716048 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kfcq\" (UniqueName: \"kubernetes.io/projected/7845d8dc-4399-4450-b1d6-d424a8d64539-kube-api-access-7kfcq\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm\" (UID: \"7845d8dc-4399-4450-b1d6-d424a8d64539\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm" Oct 03 14:54:27 crc kubenswrapper[4774]: I1003 14:54:27.716235 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7845d8dc-4399-4450-b1d6-d424a8d64539-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm\" (UID: \"7845d8dc-4399-4450-b1d6-d424a8d64539\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm" Oct 03 14:54:27 crc kubenswrapper[4774]: I1003 14:54:27.818023 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kfcq\" (UniqueName: \"kubernetes.io/projected/7845d8dc-4399-4450-b1d6-d424a8d64539-kube-api-access-7kfcq\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm\" (UID: \"7845d8dc-4399-4450-b1d6-d424a8d64539\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm" Oct 03 14:54:27 crc kubenswrapper[4774]: I1003 14:54:27.818126 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7845d8dc-4399-4450-b1d6-d424a8d64539-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm\" (UID: \"7845d8dc-4399-4450-b1d6-d424a8d64539\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm" Oct 03 14:54:27 crc kubenswrapper[4774]: I1003 14:54:27.818203 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7845d8dc-4399-4450-b1d6-d424a8d64539-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm\" (UID: \"7845d8dc-4399-4450-b1d6-d424a8d64539\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm" Oct 03 14:54:27 crc kubenswrapper[4774]: I1003 14:54:27.818703 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/7845d8dc-4399-4450-b1d6-d424a8d64539-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm\" (UID: \"7845d8dc-4399-4450-b1d6-d424a8d64539\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm" Oct 03 14:54:27 crc kubenswrapper[4774]: I1003 14:54:27.818882 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7845d8dc-4399-4450-b1d6-d424a8d64539-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm\" (UID: \"7845d8dc-4399-4450-b1d6-d424a8d64539\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm" Oct 03 14:54:27 crc kubenswrapper[4774]: I1003 14:54:27.843098 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kfcq\" (UniqueName: \"kubernetes.io/projected/7845d8dc-4399-4450-b1d6-d424a8d64539-kube-api-access-7kfcq\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm\" (UID: \"7845d8dc-4399-4450-b1d6-d424a8d64539\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm" Oct 03 14:54:27 crc kubenswrapper[4774]: I1003 14:54:27.930814 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm" Oct 03 14:54:28 crc kubenswrapper[4774]: I1003 14:54:28.377900 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm"] Oct 03 14:54:28 crc kubenswrapper[4774]: W1003 14:54:28.387869 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7845d8dc_4399_4450_b1d6_d424a8d64539.slice/crio-7644cbe7ef2c7718871addcdd340dd50845f6361a904fdf1a74798203f8fbe4a WatchSource:0}: Error finding container 7644cbe7ef2c7718871addcdd340dd50845f6361a904fdf1a74798203f8fbe4a: Status 404 returned error can't find the container with id 7644cbe7ef2c7718871addcdd340dd50845f6361a904fdf1a74798203f8fbe4a Oct 03 14:54:28 crc kubenswrapper[4774]: I1003 14:54:28.893724 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm" event={"ID":"7845d8dc-4399-4450-b1d6-d424a8d64539","Type":"ContainerStarted","Data":"b5bff735ea69ccf62c4156f1e88423f3e92ce4d8873e1d9b7b5ce043420e3846"} Oct 03 14:54:28 crc kubenswrapper[4774]: I1003 14:54:28.894101 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm" event={"ID":"7845d8dc-4399-4450-b1d6-d424a8d64539","Type":"ContainerStarted","Data":"7644cbe7ef2c7718871addcdd340dd50845f6361a904fdf1a74798203f8fbe4a"} Oct 03 14:54:29 crc kubenswrapper[4774]: I1003 14:54:29.902698 4774 generic.go:334] "Generic (PLEG): container finished" podID="7845d8dc-4399-4450-b1d6-d424a8d64539" containerID="b5bff735ea69ccf62c4156f1e88423f3e92ce4d8873e1d9b7b5ce043420e3846" exitCode=0 Oct 03 14:54:29 crc kubenswrapper[4774]: I1003 14:54:29.902745 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm" event={"ID":"7845d8dc-4399-4450-b1d6-d424a8d64539","Type":"ContainerDied","Data":"b5bff735ea69ccf62c4156f1e88423f3e92ce4d8873e1d9b7b5ce043420e3846"} Oct 03 14:54:31 crc kubenswrapper[4774]: I1003 14:54:31.918565 4774 generic.go:334] "Generic (PLEG): container finished" podID="7845d8dc-4399-4450-b1d6-d424a8d64539" containerID="44be2178b5f4c7ddf27aee6049774e669e8a035ff1aad415747177644ae7cd83" exitCode=0 Oct 03 14:54:31 crc kubenswrapper[4774]: I1003 14:54:31.918646 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm" event={"ID":"7845d8dc-4399-4450-b1d6-d424a8d64539","Type":"ContainerDied","Data":"44be2178b5f4c7ddf27aee6049774e669e8a035ff1aad415747177644ae7cd83"} Oct 03 14:54:32 crc kubenswrapper[4774]: I1003 14:54:32.930195 4774 generic.go:334] "Generic (PLEG): container finished" podID="7845d8dc-4399-4450-b1d6-d424a8d64539" containerID="84610b76fa3a6a1d05935f8b0156baca942905647875b689e23764d9f6531606" exitCode=0 Oct 03 14:54:32 crc kubenswrapper[4774]: I1003 14:54:32.930241 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm" event={"ID":"7845d8dc-4399-4450-b1d6-d424a8d64539","Type":"ContainerDied","Data":"84610b76fa3a6a1d05935f8b0156baca942905647875b689e23764d9f6531606"} Oct 03 14:54:34 crc kubenswrapper[4774]: I1003 14:54:34.171577 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm" Oct 03 14:54:34 crc kubenswrapper[4774]: I1003 14:54:34.205217 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7845d8dc-4399-4450-b1d6-d424a8d64539-bundle\") pod \"7845d8dc-4399-4450-b1d6-d424a8d64539\" (UID: \"7845d8dc-4399-4450-b1d6-d424a8d64539\") " Oct 03 14:54:34 crc kubenswrapper[4774]: I1003 14:54:34.205360 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kfcq\" (UniqueName: \"kubernetes.io/projected/7845d8dc-4399-4450-b1d6-d424a8d64539-kube-api-access-7kfcq\") pod \"7845d8dc-4399-4450-b1d6-d424a8d64539\" (UID: \"7845d8dc-4399-4450-b1d6-d424a8d64539\") " Oct 03 14:54:34 crc kubenswrapper[4774]: I1003 14:54:34.205442 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7845d8dc-4399-4450-b1d6-d424a8d64539-util\") pod \"7845d8dc-4399-4450-b1d6-d424a8d64539\" (UID: \"7845d8dc-4399-4450-b1d6-d424a8d64539\") " Oct 03 14:54:34 crc kubenswrapper[4774]: I1003 14:54:34.206090 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7845d8dc-4399-4450-b1d6-d424a8d64539-bundle" (OuterVolumeSpecName: "bundle") pod "7845d8dc-4399-4450-b1d6-d424a8d64539" (UID: "7845d8dc-4399-4450-b1d6-d424a8d64539"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:54:34 crc kubenswrapper[4774]: I1003 14:54:34.209683 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7845d8dc-4399-4450-b1d6-d424a8d64539-kube-api-access-7kfcq" (OuterVolumeSpecName: "kube-api-access-7kfcq") pod "7845d8dc-4399-4450-b1d6-d424a8d64539" (UID: "7845d8dc-4399-4450-b1d6-d424a8d64539"). InnerVolumeSpecName "kube-api-access-7kfcq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:54:34 crc kubenswrapper[4774]: I1003 14:54:34.271898 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7845d8dc-4399-4450-b1d6-d424a8d64539-util" (OuterVolumeSpecName: "util") pod "7845d8dc-4399-4450-b1d6-d424a8d64539" (UID: "7845d8dc-4399-4450-b1d6-d424a8d64539"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:54:34 crc kubenswrapper[4774]: I1003 14:54:34.315807 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kfcq\" (UniqueName: \"kubernetes.io/projected/7845d8dc-4399-4450-b1d6-d424a8d64539-kube-api-access-7kfcq\") on node \"crc\" DevicePath \"\"" Oct 03 14:54:34 crc kubenswrapper[4774]: I1003 14:54:34.316063 4774 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7845d8dc-4399-4450-b1d6-d424a8d64539-util\") on node \"crc\" DevicePath \"\"" Oct 03 14:54:34 crc kubenswrapper[4774]: I1003 14:54:34.316083 4774 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7845d8dc-4399-4450-b1d6-d424a8d64539-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:54:34 crc kubenswrapper[4774]: I1003 14:54:34.945208 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm" event={"ID":"7845d8dc-4399-4450-b1d6-d424a8d64539","Type":"ContainerDied","Data":"7644cbe7ef2c7718871addcdd340dd50845f6361a904fdf1a74798203f8fbe4a"} Oct 03 14:54:34 crc kubenswrapper[4774]: I1003 14:54:34.945261 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7644cbe7ef2c7718871addcdd340dd50845f6361a904fdf1a74798203f8fbe4a" Oct 03 14:54:34 crc kubenswrapper[4774]: I1003 14:54:34.945416 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm" Oct 03 14:54:39 crc kubenswrapper[4774]: I1003 14:54:39.154620 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-59jwx"] Oct 03 14:54:39 crc kubenswrapper[4774]: E1003 14:54:39.155197 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7845d8dc-4399-4450-b1d6-d424a8d64539" containerName="extract" Oct 03 14:54:39 crc kubenswrapper[4774]: I1003 14:54:39.155216 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="7845d8dc-4399-4450-b1d6-d424a8d64539" containerName="extract" Oct 03 14:54:39 crc kubenswrapper[4774]: E1003 14:54:39.155228 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7845d8dc-4399-4450-b1d6-d424a8d64539" containerName="pull" Oct 03 14:54:39 crc kubenswrapper[4774]: I1003 14:54:39.155236 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="7845d8dc-4399-4450-b1d6-d424a8d64539" containerName="pull" Oct 03 14:54:39 crc kubenswrapper[4774]: E1003 14:54:39.155258 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7845d8dc-4399-4450-b1d6-d424a8d64539" containerName="util" Oct 03 14:54:39 crc kubenswrapper[4774]: I1003 14:54:39.155265 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="7845d8dc-4399-4450-b1d6-d424a8d64539" containerName="util" Oct 03 14:54:39 crc kubenswrapper[4774]: I1003 14:54:39.155388 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="7845d8dc-4399-4450-b1d6-d424a8d64539" containerName="extract" Oct 03 14:54:39 crc kubenswrapper[4774]: I1003 14:54:39.155889 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-59jwx" Oct 03 14:54:39 crc kubenswrapper[4774]: I1003 14:54:39.158702 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 03 14:54:39 crc kubenswrapper[4774]: I1003 14:54:39.158725 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 03 14:54:39 crc kubenswrapper[4774]: I1003 14:54:39.158792 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-mgcph" Oct 03 14:54:39 crc kubenswrapper[4774]: I1003 14:54:39.177082 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-59jwx"] Oct 03 14:54:39 crc kubenswrapper[4774]: I1003 14:54:39.276238 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k47x\" (UniqueName: \"kubernetes.io/projected/a0e3f6b9-0e1a-4627-aad6-b6252f7f4ab9-kube-api-access-7k47x\") pod \"nmstate-operator-858ddd8f98-59jwx\" (UID: \"a0e3f6b9-0e1a-4627-aad6-b6252f7f4ab9\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-59jwx" Oct 03 14:54:39 crc kubenswrapper[4774]: I1003 14:54:39.377510 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k47x\" (UniqueName: \"kubernetes.io/projected/a0e3f6b9-0e1a-4627-aad6-b6252f7f4ab9-kube-api-access-7k47x\") pod \"nmstate-operator-858ddd8f98-59jwx\" (UID: \"a0e3f6b9-0e1a-4627-aad6-b6252f7f4ab9\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-59jwx" Oct 03 14:54:39 crc kubenswrapper[4774]: I1003 14:54:39.397455 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k47x\" (UniqueName: \"kubernetes.io/projected/a0e3f6b9-0e1a-4627-aad6-b6252f7f4ab9-kube-api-access-7k47x\") pod \"nmstate-operator-858ddd8f98-59jwx\" (UID: 
\"a0e3f6b9-0e1a-4627-aad6-b6252f7f4ab9\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-59jwx" Oct 03 14:54:39 crc kubenswrapper[4774]: I1003 14:54:39.472293 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-59jwx" Oct 03 14:54:39 crc kubenswrapper[4774]: I1003 14:54:39.718305 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-59jwx"] Oct 03 14:54:39 crc kubenswrapper[4774]: I1003 14:54:39.972738 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-59jwx" event={"ID":"a0e3f6b9-0e1a-4627-aad6-b6252f7f4ab9","Type":"ContainerStarted","Data":"e928e6686b01b6ef1f8b86a902b8f6f0c85f514f9385eb1d1f4cd64f77b602f7"} Oct 03 14:54:43 crc kubenswrapper[4774]: I1003 14:54:43.000425 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-59jwx" event={"ID":"a0e3f6b9-0e1a-4627-aad6-b6252f7f4ab9","Type":"ContainerStarted","Data":"067dab949535f8aa01db67444f00943ea120ff8d97843f6f2a0770181b8e7f32"} Oct 03 14:54:43 crc kubenswrapper[4774]: I1003 14:54:43.018272 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-59jwx" podStartSLOduration=1.817997036 podStartE2EDuration="4.018243369s" podCreationTimestamp="2025-10-03 14:54:39 +0000 UTC" firstStartedPulling="2025-10-03 14:54:39.730603472 +0000 UTC m=+702.319806944" lastFinishedPulling="2025-10-03 14:54:41.930849825 +0000 UTC m=+704.520053277" observedRunningTime="2025-10-03 14:54:43.016441695 +0000 UTC m=+705.605645197" watchObservedRunningTime="2025-10-03 14:54:43.018243369 +0000 UTC m=+705.607446861" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.098715 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-4lb7n"] Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 
14:54:48.099859 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-4lb7n" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.101851 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-xbl2t" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.113689 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-4lb7n"] Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.117337 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-hrzkg"] Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.118155 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-hrzkg" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.120364 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.150125 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-bc8h5"] Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.151009 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-bc8h5" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.153349 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-hrzkg"] Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.239936 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-p8hlt"] Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.240625 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-p8hlt" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.242487 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-4hfxc" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.242838 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.243167 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.250811 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-p8hlt"] Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.291549 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxw7k\" (UniqueName: \"kubernetes.io/projected/e582dd92-97d1-48bf-a81b-ad144d0a89cf-kube-api-access-kxw7k\") pod \"nmstate-webhook-6cdbc54649-hrzkg\" (UID: \"e582dd92-97d1-48bf-a81b-ad144d0a89cf\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-hrzkg" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.291600 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7f5h\" (UniqueName: \"kubernetes.io/projected/6aaa4a80-d658-4f5a-8b6c-cc84ad781c64-kube-api-access-z7f5h\") pod \"nmstate-handler-bc8h5\" (UID: \"6aaa4a80-d658-4f5a-8b6c-cc84ad781c64\") " pod="openshift-nmstate/nmstate-handler-bc8h5" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.291639 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6aaa4a80-d658-4f5a-8b6c-cc84ad781c64-nmstate-lock\") pod \"nmstate-handler-bc8h5\" (UID: \"6aaa4a80-d658-4f5a-8b6c-cc84ad781c64\") " 
pod="openshift-nmstate/nmstate-handler-bc8h5" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.291669 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6aaa4a80-d658-4f5a-8b6c-cc84ad781c64-dbus-socket\") pod \"nmstate-handler-bc8h5\" (UID: \"6aaa4a80-d658-4f5a-8b6c-cc84ad781c64\") " pod="openshift-nmstate/nmstate-handler-bc8h5" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.291701 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k57jg\" (UniqueName: \"kubernetes.io/projected/bd99f12e-3622-42d7-bece-2149da359b49-kube-api-access-k57jg\") pod \"nmstate-metrics-fdff9cb8d-4lb7n\" (UID: \"bd99f12e-3622-42d7-bece-2149da359b49\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-4lb7n" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.291722 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6aaa4a80-d658-4f5a-8b6c-cc84ad781c64-ovs-socket\") pod \"nmstate-handler-bc8h5\" (UID: \"6aaa4a80-d658-4f5a-8b6c-cc84ad781c64\") " pod="openshift-nmstate/nmstate-handler-bc8h5" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.291765 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e582dd92-97d1-48bf-a81b-ad144d0a89cf-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-hrzkg\" (UID: \"e582dd92-97d1-48bf-a81b-ad144d0a89cf\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-hrzkg" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.392657 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8677m\" (UniqueName: \"kubernetes.io/projected/e1390648-f53e-4c1c-a801-2c76be9fa959-kube-api-access-8677m\") pod 
\"nmstate-console-plugin-6b874cbd85-p8hlt\" (UID: \"e1390648-f53e-4c1c-a801-2c76be9fa959\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-p8hlt" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.392703 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e1390648-f53e-4c1c-a801-2c76be9fa959-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-p8hlt\" (UID: \"e1390648-f53e-4c1c-a801-2c76be9fa959\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-p8hlt" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.392729 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1390648-f53e-4c1c-a801-2c76be9fa959-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-p8hlt\" (UID: \"e1390648-f53e-4c1c-a801-2c76be9fa959\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-p8hlt" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.392766 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k57jg\" (UniqueName: \"kubernetes.io/projected/bd99f12e-3622-42d7-bece-2149da359b49-kube-api-access-k57jg\") pod \"nmstate-metrics-fdff9cb8d-4lb7n\" (UID: \"bd99f12e-3622-42d7-bece-2149da359b49\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-4lb7n" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.392792 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6aaa4a80-d658-4f5a-8b6c-cc84ad781c64-ovs-socket\") pod \"nmstate-handler-bc8h5\" (UID: \"6aaa4a80-d658-4f5a-8b6c-cc84ad781c64\") " pod="openshift-nmstate/nmstate-handler-bc8h5" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.392840 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: 
\"kubernetes.io/secret/e582dd92-97d1-48bf-a81b-ad144d0a89cf-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-hrzkg\" (UID: \"e582dd92-97d1-48bf-a81b-ad144d0a89cf\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-hrzkg" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.392872 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxw7k\" (UniqueName: \"kubernetes.io/projected/e582dd92-97d1-48bf-a81b-ad144d0a89cf-kube-api-access-kxw7k\") pod \"nmstate-webhook-6cdbc54649-hrzkg\" (UID: \"e582dd92-97d1-48bf-a81b-ad144d0a89cf\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-hrzkg" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.392899 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7f5h\" (UniqueName: \"kubernetes.io/projected/6aaa4a80-d658-4f5a-8b6c-cc84ad781c64-kube-api-access-z7f5h\") pod \"nmstate-handler-bc8h5\" (UID: \"6aaa4a80-d658-4f5a-8b6c-cc84ad781c64\") " pod="openshift-nmstate/nmstate-handler-bc8h5" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.392925 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6aaa4a80-d658-4f5a-8b6c-cc84ad781c64-nmstate-lock\") pod \"nmstate-handler-bc8h5\" (UID: \"6aaa4a80-d658-4f5a-8b6c-cc84ad781c64\") " pod="openshift-nmstate/nmstate-handler-bc8h5" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.392945 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6aaa4a80-d658-4f5a-8b6c-cc84ad781c64-dbus-socket\") pod \"nmstate-handler-bc8h5\" (UID: \"6aaa4a80-d658-4f5a-8b6c-cc84ad781c64\") " pod="openshift-nmstate/nmstate-handler-bc8h5" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.393153 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: 
\"kubernetes.io/host-path/6aaa4a80-d658-4f5a-8b6c-cc84ad781c64-dbus-socket\") pod \"nmstate-handler-bc8h5\" (UID: \"6aaa4a80-d658-4f5a-8b6c-cc84ad781c64\") " pod="openshift-nmstate/nmstate-handler-bc8h5" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.393175 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6aaa4a80-d658-4f5a-8b6c-cc84ad781c64-ovs-socket\") pod \"nmstate-handler-bc8h5\" (UID: \"6aaa4a80-d658-4f5a-8b6c-cc84ad781c64\") " pod="openshift-nmstate/nmstate-handler-bc8h5" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.393427 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6aaa4a80-d658-4f5a-8b6c-cc84ad781c64-nmstate-lock\") pod \"nmstate-handler-bc8h5\" (UID: \"6aaa4a80-d658-4f5a-8b6c-cc84ad781c64\") " pod="openshift-nmstate/nmstate-handler-bc8h5" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.408249 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e582dd92-97d1-48bf-a81b-ad144d0a89cf-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-hrzkg\" (UID: \"e582dd92-97d1-48bf-a81b-ad144d0a89cf\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-hrzkg" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.414276 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxw7k\" (UniqueName: \"kubernetes.io/projected/e582dd92-97d1-48bf-a81b-ad144d0a89cf-kube-api-access-kxw7k\") pod \"nmstate-webhook-6cdbc54649-hrzkg\" (UID: \"e582dd92-97d1-48bf-a81b-ad144d0a89cf\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-hrzkg" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.416926 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7794bcf888-x7stf"] Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.417651 4774 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-console/console-7794bcf888-x7stf" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.419360 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k57jg\" (UniqueName: \"kubernetes.io/projected/bd99f12e-3622-42d7-bece-2149da359b49-kube-api-access-k57jg\") pod \"nmstate-metrics-fdff9cb8d-4lb7n\" (UID: \"bd99f12e-3622-42d7-bece-2149da359b49\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-4lb7n" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.426014 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7f5h\" (UniqueName: \"kubernetes.io/projected/6aaa4a80-d658-4f5a-8b6c-cc84ad781c64-kube-api-access-z7f5h\") pod \"nmstate-handler-bc8h5\" (UID: \"6aaa4a80-d658-4f5a-8b6c-cc84ad781c64\") " pod="openshift-nmstate/nmstate-handler-bc8h5" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.427480 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7794bcf888-x7stf"] Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.439998 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-hrzkg" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.474847 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-bc8h5" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.494566 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e1390648-f53e-4c1c-a801-2c76be9fa959-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-p8hlt\" (UID: \"e1390648-f53e-4c1c-a801-2c76be9fa959\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-p8hlt" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.494597 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8677m\" (UniqueName: \"kubernetes.io/projected/e1390648-f53e-4c1c-a801-2c76be9fa959-kube-api-access-8677m\") pod \"nmstate-console-plugin-6b874cbd85-p8hlt\" (UID: \"e1390648-f53e-4c1c-a801-2c76be9fa959\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-p8hlt" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.494622 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1390648-f53e-4c1c-a801-2c76be9fa959-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-p8hlt\" (UID: \"e1390648-f53e-4c1c-a801-2c76be9fa959\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-p8hlt" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.496153 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e1390648-f53e-4c1c-a801-2c76be9fa959-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-p8hlt\" (UID: \"e1390648-f53e-4c1c-a801-2c76be9fa959\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-p8hlt" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.498466 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1390648-f53e-4c1c-a801-2c76be9fa959-plugin-serving-cert\") 
pod \"nmstate-console-plugin-6b874cbd85-p8hlt\" (UID: \"e1390648-f53e-4c1c-a801-2c76be9fa959\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-p8hlt" Oct 03 14:54:48 crc kubenswrapper[4774]: W1003 14:54:48.502571 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6aaa4a80_d658_4f5a_8b6c_cc84ad781c64.slice/crio-1fccf63ab38fc979fb5698fdd10f158757855b3d4f94041593a072c4ee5ca92e WatchSource:0}: Error finding container 1fccf63ab38fc979fb5698fdd10f158757855b3d4f94041593a072c4ee5ca92e: Status 404 returned error can't find the container with id 1fccf63ab38fc979fb5698fdd10f158757855b3d4f94041593a072c4ee5ca92e Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.514158 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8677m\" (UniqueName: \"kubernetes.io/projected/e1390648-f53e-4c1c-a801-2c76be9fa959-kube-api-access-8677m\") pod \"nmstate-console-plugin-6b874cbd85-p8hlt\" (UID: \"e1390648-f53e-4c1c-a801-2c76be9fa959\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-p8hlt" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.556444 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-p8hlt" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.596805 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjp52\" (UniqueName: \"kubernetes.io/projected/57ccdf25-ed57-4336-8c39-31367c334c94-kube-api-access-gjp52\") pod \"console-7794bcf888-x7stf\" (UID: \"57ccdf25-ed57-4336-8c39-31367c334c94\") " pod="openshift-console/console-7794bcf888-x7stf" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.597150 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/57ccdf25-ed57-4336-8c39-31367c334c94-oauth-serving-cert\") pod \"console-7794bcf888-x7stf\" (UID: \"57ccdf25-ed57-4336-8c39-31367c334c94\") " pod="openshift-console/console-7794bcf888-x7stf" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.597178 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57ccdf25-ed57-4336-8c39-31367c334c94-trusted-ca-bundle\") pod \"console-7794bcf888-x7stf\" (UID: \"57ccdf25-ed57-4336-8c39-31367c334c94\") " pod="openshift-console/console-7794bcf888-x7stf" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.597206 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/57ccdf25-ed57-4336-8c39-31367c334c94-console-serving-cert\") pod \"console-7794bcf888-x7stf\" (UID: \"57ccdf25-ed57-4336-8c39-31367c334c94\") " pod="openshift-console/console-7794bcf888-x7stf" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.597234 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/57ccdf25-ed57-4336-8c39-31367c334c94-console-config\") pod \"console-7794bcf888-x7stf\" (UID: \"57ccdf25-ed57-4336-8c39-31367c334c94\") " pod="openshift-console/console-7794bcf888-x7stf" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.597256 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/57ccdf25-ed57-4336-8c39-31367c334c94-service-ca\") pod \"console-7794bcf888-x7stf\" (UID: \"57ccdf25-ed57-4336-8c39-31367c334c94\") " pod="openshift-console/console-7794bcf888-x7stf" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.597281 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/57ccdf25-ed57-4336-8c39-31367c334c94-console-oauth-config\") pod \"console-7794bcf888-x7stf\" (UID: \"57ccdf25-ed57-4336-8c39-31367c334c94\") " pod="openshift-console/console-7794bcf888-x7stf" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.698389 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/57ccdf25-ed57-4336-8c39-31367c334c94-console-serving-cert\") pod \"console-7794bcf888-x7stf\" (UID: \"57ccdf25-ed57-4336-8c39-31367c334c94\") " pod="openshift-console/console-7794bcf888-x7stf" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.698430 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/57ccdf25-ed57-4336-8c39-31367c334c94-console-config\") pod \"console-7794bcf888-x7stf\" (UID: \"57ccdf25-ed57-4336-8c39-31367c334c94\") " pod="openshift-console/console-7794bcf888-x7stf" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.698449 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/57ccdf25-ed57-4336-8c39-31367c334c94-service-ca\") pod \"console-7794bcf888-x7stf\" (UID: \"57ccdf25-ed57-4336-8c39-31367c334c94\") " pod="openshift-console/console-7794bcf888-x7stf" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.698467 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/57ccdf25-ed57-4336-8c39-31367c334c94-console-oauth-config\") pod \"console-7794bcf888-x7stf\" (UID: \"57ccdf25-ed57-4336-8c39-31367c334c94\") " pod="openshift-console/console-7794bcf888-x7stf" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.698518 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjp52\" (UniqueName: \"kubernetes.io/projected/57ccdf25-ed57-4336-8c39-31367c334c94-kube-api-access-gjp52\") pod \"console-7794bcf888-x7stf\" (UID: \"57ccdf25-ed57-4336-8c39-31367c334c94\") " pod="openshift-console/console-7794bcf888-x7stf" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.698557 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/57ccdf25-ed57-4336-8c39-31367c334c94-oauth-serving-cert\") pod \"console-7794bcf888-x7stf\" (UID: \"57ccdf25-ed57-4336-8c39-31367c334c94\") " pod="openshift-console/console-7794bcf888-x7stf" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.698574 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57ccdf25-ed57-4336-8c39-31367c334c94-trusted-ca-bundle\") pod \"console-7794bcf888-x7stf\" (UID: \"57ccdf25-ed57-4336-8c39-31367c334c94\") " pod="openshift-console/console-7794bcf888-x7stf" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.699301 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/57ccdf25-ed57-4336-8c39-31367c334c94-service-ca\") pod \"console-7794bcf888-x7stf\" (UID: \"57ccdf25-ed57-4336-8c39-31367c334c94\") " pod="openshift-console/console-7794bcf888-x7stf" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.699734 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57ccdf25-ed57-4336-8c39-31367c334c94-trusted-ca-bundle\") pod \"console-7794bcf888-x7stf\" (UID: \"57ccdf25-ed57-4336-8c39-31367c334c94\") " pod="openshift-console/console-7794bcf888-x7stf" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.699802 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/57ccdf25-ed57-4336-8c39-31367c334c94-oauth-serving-cert\") pod \"console-7794bcf888-x7stf\" (UID: \"57ccdf25-ed57-4336-8c39-31367c334c94\") " pod="openshift-console/console-7794bcf888-x7stf" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.700583 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/57ccdf25-ed57-4336-8c39-31367c334c94-console-config\") pod \"console-7794bcf888-x7stf\" (UID: \"57ccdf25-ed57-4336-8c39-31367c334c94\") " pod="openshift-console/console-7794bcf888-x7stf" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.705606 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/57ccdf25-ed57-4336-8c39-31367c334c94-console-serving-cert\") pod \"console-7794bcf888-x7stf\" (UID: \"57ccdf25-ed57-4336-8c39-31367c334c94\") " pod="openshift-console/console-7794bcf888-x7stf" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.706549 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/57ccdf25-ed57-4336-8c39-31367c334c94-console-oauth-config\") pod \"console-7794bcf888-x7stf\" (UID: \"57ccdf25-ed57-4336-8c39-31367c334c94\") " pod="openshift-console/console-7794bcf888-x7stf" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.716282 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-4lb7n" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.722254 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-p8hlt"] Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.723419 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjp52\" (UniqueName: \"kubernetes.io/projected/57ccdf25-ed57-4336-8c39-31367c334c94-kube-api-access-gjp52\") pod \"console-7794bcf888-x7stf\" (UID: \"57ccdf25-ed57-4336-8c39-31367c334c94\") " pod="openshift-console/console-7794bcf888-x7stf" Oct 03 14:54:48 crc kubenswrapper[4774]: W1003 14:54:48.729074 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1390648_f53e_4c1c_a801_2c76be9fa959.slice/crio-713309b82834b65684db579df7644bb5259999f079f930f34c77b69987ad7104 WatchSource:0}: Error finding container 713309b82834b65684db579df7644bb5259999f079f930f34c77b69987ad7104: Status 404 returned error can't find the container with id 713309b82834b65684db579df7644bb5259999f079f930f34c77b69987ad7104 Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.827223 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7794bcf888-x7stf" Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.881697 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-hrzkg"] Oct 03 14:54:48 crc kubenswrapper[4774]: I1003 14:54:48.923548 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-4lb7n"] Oct 03 14:54:48 crc kubenswrapper[4774]: W1003 14:54:48.954786 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd99f12e_3622_42d7_bece_2149da359b49.slice/crio-97eff73cdab52685c85beb87db938aa65b9a3df8c09b119256aa89113b7522da WatchSource:0}: Error finding container 97eff73cdab52685c85beb87db938aa65b9a3df8c09b119256aa89113b7522da: Status 404 returned error can't find the container with id 97eff73cdab52685c85beb87db938aa65b9a3df8c09b119256aa89113b7522da Oct 03 14:54:49 crc kubenswrapper[4774]: I1003 14:54:49.035045 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-p8hlt" event={"ID":"e1390648-f53e-4c1c-a801-2c76be9fa959","Type":"ContainerStarted","Data":"713309b82834b65684db579df7644bb5259999f079f930f34c77b69987ad7104"} Oct 03 14:54:49 crc kubenswrapper[4774]: I1003 14:54:49.035897 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-4lb7n" event={"ID":"bd99f12e-3622-42d7-bece-2149da359b49","Type":"ContainerStarted","Data":"97eff73cdab52685c85beb87db938aa65b9a3df8c09b119256aa89113b7522da"} Oct 03 14:54:49 crc kubenswrapper[4774]: I1003 14:54:49.036603 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-hrzkg" event={"ID":"e582dd92-97d1-48bf-a81b-ad144d0a89cf","Type":"ContainerStarted","Data":"5d5a9d5d43a01941adda49b8122b852f882a6cee939011400e0fe64a29028f16"} Oct 03 14:54:49 crc kubenswrapper[4774]: I1003 
14:54:49.037327 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-bc8h5" event={"ID":"6aaa4a80-d658-4f5a-8b6c-cc84ad781c64","Type":"ContainerStarted","Data":"1fccf63ab38fc979fb5698fdd10f158757855b3d4f94041593a072c4ee5ca92e"} Oct 03 14:54:49 crc kubenswrapper[4774]: I1003 14:54:49.276929 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7794bcf888-x7stf"] Oct 03 14:54:49 crc kubenswrapper[4774]: W1003 14:54:49.296791 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57ccdf25_ed57_4336_8c39_31367c334c94.slice/crio-67967a71b9645264151e7e4f44519d23a8b547b375b48c4a92c8828b00980950 WatchSource:0}: Error finding container 67967a71b9645264151e7e4f44519d23a8b547b375b48c4a92c8828b00980950: Status 404 returned error can't find the container with id 67967a71b9645264151e7e4f44519d23a8b547b375b48c4a92c8828b00980950 Oct 03 14:54:50 crc kubenswrapper[4774]: I1003 14:54:50.048789 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7794bcf888-x7stf" event={"ID":"57ccdf25-ed57-4336-8c39-31367c334c94","Type":"ContainerStarted","Data":"a7a81b6877e2d69993f6122e82a5209c994effbe993674d49a8e1c466c0355a4"} Oct 03 14:54:50 crc kubenswrapper[4774]: I1003 14:54:50.048874 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7794bcf888-x7stf" event={"ID":"57ccdf25-ed57-4336-8c39-31367c334c94","Type":"ContainerStarted","Data":"67967a71b9645264151e7e4f44519d23a8b547b375b48c4a92c8828b00980950"} Oct 03 14:54:50 crc kubenswrapper[4774]: I1003 14:54:50.076207 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7794bcf888-x7stf" podStartSLOduration=2.076178332 podStartE2EDuration="2.076178332s" podCreationTimestamp="2025-10-03 14:54:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:54:50.073714531 +0000 UTC m=+712.662917993" watchObservedRunningTime="2025-10-03 14:54:50.076178332 +0000 UTC m=+712.665381824" Oct 03 14:54:52 crc kubenswrapper[4774]: I1003 14:54:52.062485 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-hrzkg" event={"ID":"e582dd92-97d1-48bf-a81b-ad144d0a89cf","Type":"ContainerStarted","Data":"54fa6fd266da260120c33190dd2ae8a5fd75abf74a40328b68cb93895d775506"} Oct 03 14:54:52 crc kubenswrapper[4774]: I1003 14:54:52.063160 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-hrzkg" Oct 03 14:54:52 crc kubenswrapper[4774]: I1003 14:54:52.065016 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-bc8h5" event={"ID":"6aaa4a80-d658-4f5a-8b6c-cc84ad781c64","Type":"ContainerStarted","Data":"a19f908af4df2281d8431763a157abcbbe5f8101028b68dd90d48b0bb515e356"} Oct 03 14:54:52 crc kubenswrapper[4774]: I1003 14:54:52.065151 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-bc8h5" Oct 03 14:54:52 crc kubenswrapper[4774]: I1003 14:54:52.080073 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-p8hlt" event={"ID":"e1390648-f53e-4c1c-a801-2c76be9fa959","Type":"ContainerStarted","Data":"84952233de551782c2c24227310a42bc3a4f13694298bbef8fe69ffc9dd02874"} Oct 03 14:54:52 crc kubenswrapper[4774]: I1003 14:54:52.083018 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-4lb7n" event={"ID":"bd99f12e-3622-42d7-bece-2149da359b49","Type":"ContainerStarted","Data":"a0e4a865ad26cc49b788912adcac83c3c3741fa4225f0e64e7c036ff3eaf501f"} Oct 03 14:54:52 crc kubenswrapper[4774]: I1003 14:54:52.093206 4774 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-hrzkg" podStartSLOduration=1.243167248 podStartE2EDuration="4.093174888s" podCreationTimestamp="2025-10-03 14:54:48 +0000 UTC" firstStartedPulling="2025-10-03 14:54:48.895675319 +0000 UTC m=+711.484878771" lastFinishedPulling="2025-10-03 14:54:51.745682959 +0000 UTC m=+714.334886411" observedRunningTime="2025-10-03 14:54:52.090655685 +0000 UTC m=+714.679859157" watchObservedRunningTime="2025-10-03 14:54:52.093174888 +0000 UTC m=+714.682378340" Oct 03 14:54:52 crc kubenswrapper[4774]: I1003 14:54:52.116427 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-bc8h5" podStartSLOduration=0.898946347 podStartE2EDuration="4.116407877s" podCreationTimestamp="2025-10-03 14:54:48 +0000 UTC" firstStartedPulling="2025-10-03 14:54:48.504798336 +0000 UTC m=+711.094001798" lastFinishedPulling="2025-10-03 14:54:51.722259876 +0000 UTC m=+714.311463328" observedRunningTime="2025-10-03 14:54:52.114046668 +0000 UTC m=+714.703250130" watchObservedRunningTime="2025-10-03 14:54:52.116407877 +0000 UTC m=+714.705611339" Oct 03 14:54:52 crc kubenswrapper[4774]: I1003 14:54:52.131406 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-p8hlt" podStartSLOduration=1.140969003 podStartE2EDuration="4.13139228s" podCreationTimestamp="2025-10-03 14:54:48 +0000 UTC" firstStartedPulling="2025-10-03 14:54:48.731109581 +0000 UTC m=+711.320313033" lastFinishedPulling="2025-10-03 14:54:51.721532858 +0000 UTC m=+714.310736310" observedRunningTime="2025-10-03 14:54:52.129972445 +0000 UTC m=+714.719175917" watchObservedRunningTime="2025-10-03 14:54:52.13139228 +0000 UTC m=+714.720595732" Oct 03 14:54:55 crc kubenswrapper[4774]: I1003 14:54:55.132600 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-4lb7n" 
event={"ID":"bd99f12e-3622-42d7-bece-2149da359b49","Type":"ContainerStarted","Data":"1834b89ceb09494334ed79042ebd217617dc9aeb5270b685718ecd6746e26ae8"} Oct 03 14:54:55 crc kubenswrapper[4774]: I1003 14:54:55.169938 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-4lb7n" podStartSLOduration=1.659609092 podStartE2EDuration="7.169912502s" podCreationTimestamp="2025-10-03 14:54:48 +0000 UTC" firstStartedPulling="2025-10-03 14:54:48.956728279 +0000 UTC m=+711.545931731" lastFinishedPulling="2025-10-03 14:54:54.467031689 +0000 UTC m=+717.056235141" observedRunningTime="2025-10-03 14:54:55.16058208 +0000 UTC m=+717.749785532" watchObservedRunningTime="2025-10-03 14:54:55.169912502 +0000 UTC m=+717.759115984" Oct 03 14:54:58 crc kubenswrapper[4774]: I1003 14:54:58.506829 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-bc8h5" Oct 03 14:54:58 crc kubenswrapper[4774]: I1003 14:54:58.827899 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7794bcf888-x7stf" Oct 03 14:54:58 crc kubenswrapper[4774]: I1003 14:54:58.828017 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7794bcf888-x7stf" Oct 03 14:54:58 crc kubenswrapper[4774]: I1003 14:54:58.834631 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7794bcf888-x7stf" Oct 03 14:54:59 crc kubenswrapper[4774]: I1003 14:54:59.167770 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7794bcf888-x7stf" Oct 03 14:54:59 crc kubenswrapper[4774]: I1003 14:54:59.222215 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-vnvw7"] Oct 03 14:55:08 crc kubenswrapper[4774]: I1003 14:55:08.451053 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-hrzkg" Oct 03 14:55:20 crc kubenswrapper[4774]: I1003 14:55:20.653760 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:55:20 crc kubenswrapper[4774]: I1003 14:55:20.654473 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:55:22 crc kubenswrapper[4774]: I1003 14:55:22.795281 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m"] Oct 03 14:55:22 crc kubenswrapper[4774]: I1003 14:55:22.797245 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m" Oct 03 14:55:22 crc kubenswrapper[4774]: I1003 14:55:22.801401 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 03 14:55:22 crc kubenswrapper[4774]: I1003 14:55:22.806986 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m"] Oct 03 14:55:22 crc kubenswrapper[4774]: I1003 14:55:22.923492 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh852\" (UniqueName: \"kubernetes.io/projected/53c45343-23b3-4606-a5e1-bdd4c43b2752-kube-api-access-gh852\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m\" (UID: \"53c45343-23b3-4606-a5e1-bdd4c43b2752\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m" Oct 03 14:55:22 crc kubenswrapper[4774]: I1003 14:55:22.923607 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53c45343-23b3-4606-a5e1-bdd4c43b2752-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m\" (UID: \"53c45343-23b3-4606-a5e1-bdd4c43b2752\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m" Oct 03 14:55:22 crc kubenswrapper[4774]: I1003 14:55:22.923668 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53c45343-23b3-4606-a5e1-bdd4c43b2752-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m\" (UID: \"53c45343-23b3-4606-a5e1-bdd4c43b2752\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m" Oct 03 14:55:23 crc kubenswrapper[4774]: 
I1003 14:55:23.025548 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53c45343-23b3-4606-a5e1-bdd4c43b2752-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m\" (UID: \"53c45343-23b3-4606-a5e1-bdd4c43b2752\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m" Oct 03 14:55:23 crc kubenswrapper[4774]: I1003 14:55:23.025682 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53c45343-23b3-4606-a5e1-bdd4c43b2752-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m\" (UID: \"53c45343-23b3-4606-a5e1-bdd4c43b2752\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m" Oct 03 14:55:23 crc kubenswrapper[4774]: I1003 14:55:23.025776 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh852\" (UniqueName: \"kubernetes.io/projected/53c45343-23b3-4606-a5e1-bdd4c43b2752-kube-api-access-gh852\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m\" (UID: \"53c45343-23b3-4606-a5e1-bdd4c43b2752\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m" Oct 03 14:55:23 crc kubenswrapper[4774]: I1003 14:55:23.026266 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53c45343-23b3-4606-a5e1-bdd4c43b2752-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m\" (UID: \"53c45343-23b3-4606-a5e1-bdd4c43b2752\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m" Oct 03 14:55:23 crc kubenswrapper[4774]: I1003 14:55:23.026833 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/53c45343-23b3-4606-a5e1-bdd4c43b2752-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m\" (UID: \"53c45343-23b3-4606-a5e1-bdd4c43b2752\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m" Oct 03 14:55:23 crc kubenswrapper[4774]: I1003 14:55:23.060562 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh852\" (UniqueName: \"kubernetes.io/projected/53c45343-23b3-4606-a5e1-bdd4c43b2752-kube-api-access-gh852\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m\" (UID: \"53c45343-23b3-4606-a5e1-bdd4c43b2752\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m" Oct 03 14:55:23 crc kubenswrapper[4774]: I1003 14:55:23.131047 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m" Oct 03 14:55:23 crc kubenswrapper[4774]: I1003 14:55:23.673284 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m"] Oct 03 14:55:23 crc kubenswrapper[4774]: W1003 14:55:23.679786 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53c45343_23b3_4606_a5e1_bdd4c43b2752.slice/crio-0ddb6f2bfd38be69893bbf739e2c0c098d27a041b6d49e220525d3ec22a9a117 WatchSource:0}: Error finding container 0ddb6f2bfd38be69893bbf739e2c0c098d27a041b6d49e220525d3ec22a9a117: Status 404 returned error can't find the container with id 0ddb6f2bfd38be69893bbf739e2c0c098d27a041b6d49e220525d3ec22a9a117 Oct 03 14:55:24 crc kubenswrapper[4774]: I1003 14:55:24.258210 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-vnvw7" podUID="9cae35f2-fcf0-4014-9b5b-9887d416e8d3" 
containerName="console" containerID="cri-o://3f4a8a0cb217cd6b8c10bbac0087200b71b5c5ae918194f1419ee50edf7d34c2" gracePeriod=15 Oct 03 14:55:24 crc kubenswrapper[4774]: I1003 14:55:24.326740 4774 generic.go:334] "Generic (PLEG): container finished" podID="53c45343-23b3-4606-a5e1-bdd4c43b2752" containerID="9a2f0135065204551cfb59d5d18a613ab9799aa144209b5d1b855682c9261c69" exitCode=0 Oct 03 14:55:24 crc kubenswrapper[4774]: I1003 14:55:24.326791 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m" event={"ID":"53c45343-23b3-4606-a5e1-bdd4c43b2752","Type":"ContainerDied","Data":"9a2f0135065204551cfb59d5d18a613ab9799aa144209b5d1b855682c9261c69"} Oct 03 14:55:24 crc kubenswrapper[4774]: I1003 14:55:24.326835 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m" event={"ID":"53c45343-23b3-4606-a5e1-bdd4c43b2752","Type":"ContainerStarted","Data":"0ddb6f2bfd38be69893bbf739e2c0c098d27a041b6d49e220525d3ec22a9a117"} Oct 03 14:55:24 crc kubenswrapper[4774]: I1003 14:55:24.680847 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-vnvw7_9cae35f2-fcf0-4014-9b5b-9887d416e8d3/console/0.log" Oct 03 14:55:24 crc kubenswrapper[4774]: I1003 14:55:24.680941 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-vnvw7" Oct 03 14:55:24 crc kubenswrapper[4774]: I1003 14:55:24.854522 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-console-config\") pod \"9cae35f2-fcf0-4014-9b5b-9887d416e8d3\" (UID: \"9cae35f2-fcf0-4014-9b5b-9887d416e8d3\") " Oct 03 14:55:24 crc kubenswrapper[4774]: I1003 14:55:24.854821 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-console-serving-cert\") pod \"9cae35f2-fcf0-4014-9b5b-9887d416e8d3\" (UID: \"9cae35f2-fcf0-4014-9b5b-9887d416e8d3\") " Oct 03 14:55:24 crc kubenswrapper[4774]: I1003 14:55:24.855046 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-trusted-ca-bundle\") pod \"9cae35f2-fcf0-4014-9b5b-9887d416e8d3\" (UID: \"9cae35f2-fcf0-4014-9b5b-9887d416e8d3\") " Oct 03 14:55:24 crc kubenswrapper[4774]: I1003 14:55:24.855237 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-service-ca\") pod \"9cae35f2-fcf0-4014-9b5b-9887d416e8d3\" (UID: \"9cae35f2-fcf0-4014-9b5b-9887d416e8d3\") " Oct 03 14:55:24 crc kubenswrapper[4774]: I1003 14:55:24.855322 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwz26\" (UniqueName: \"kubernetes.io/projected/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-kube-api-access-vwz26\") pod \"9cae35f2-fcf0-4014-9b5b-9887d416e8d3\" (UID: \"9cae35f2-fcf0-4014-9b5b-9887d416e8d3\") " Oct 03 14:55:24 crc kubenswrapper[4774]: I1003 14:55:24.855418 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-oauth-serving-cert\") pod \"9cae35f2-fcf0-4014-9b5b-9887d416e8d3\" (UID: \"9cae35f2-fcf0-4014-9b5b-9887d416e8d3\") " Oct 03 14:55:24 crc kubenswrapper[4774]: I1003 14:55:24.855511 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-console-oauth-config\") pod \"9cae35f2-fcf0-4014-9b5b-9887d416e8d3\" (UID: \"9cae35f2-fcf0-4014-9b5b-9887d416e8d3\") " Oct 03 14:55:24 crc kubenswrapper[4774]: I1003 14:55:24.857614 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9cae35f2-fcf0-4014-9b5b-9887d416e8d3" (UID: "9cae35f2-fcf0-4014-9b5b-9887d416e8d3"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:55:24 crc kubenswrapper[4774]: I1003 14:55:24.858229 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-service-ca" (OuterVolumeSpecName: "service-ca") pod "9cae35f2-fcf0-4014-9b5b-9887d416e8d3" (UID: "9cae35f2-fcf0-4014-9b5b-9887d416e8d3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:55:24 crc kubenswrapper[4774]: I1003 14:55:24.859311 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9cae35f2-fcf0-4014-9b5b-9887d416e8d3" (UID: "9cae35f2-fcf0-4014-9b5b-9887d416e8d3"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:55:24 crc kubenswrapper[4774]: I1003 14:55:24.860582 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-console-config" (OuterVolumeSpecName: "console-config") pod "9cae35f2-fcf0-4014-9b5b-9887d416e8d3" (UID: "9cae35f2-fcf0-4014-9b5b-9887d416e8d3"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:55:24 crc kubenswrapper[4774]: I1003 14:55:24.866681 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-kube-api-access-vwz26" (OuterVolumeSpecName: "kube-api-access-vwz26") pod "9cae35f2-fcf0-4014-9b5b-9887d416e8d3" (UID: "9cae35f2-fcf0-4014-9b5b-9887d416e8d3"). InnerVolumeSpecName "kube-api-access-vwz26". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:55:24 crc kubenswrapper[4774]: I1003 14:55:24.866795 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9cae35f2-fcf0-4014-9b5b-9887d416e8d3" (UID: "9cae35f2-fcf0-4014-9b5b-9887d416e8d3"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:55:24 crc kubenswrapper[4774]: I1003 14:55:24.873712 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9cae35f2-fcf0-4014-9b5b-9887d416e8d3" (UID: "9cae35f2-fcf0-4014-9b5b-9887d416e8d3"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:55:24 crc kubenswrapper[4774]: I1003 14:55:24.958045 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:55:24 crc kubenswrapper[4774]: I1003 14:55:24.958076 4774 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 14:55:24 crc kubenswrapper[4774]: I1003 14:55:24.958085 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwz26\" (UniqueName: \"kubernetes.io/projected/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-kube-api-access-vwz26\") on node \"crc\" DevicePath \"\"" Oct 03 14:55:24 crc kubenswrapper[4774]: I1003 14:55:24.958095 4774 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:55:24 crc kubenswrapper[4774]: I1003 14:55:24.958106 4774 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:55:24 crc kubenswrapper[4774]: I1003 14:55:24.958113 4774 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-console-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:55:24 crc kubenswrapper[4774]: I1003 14:55:24.958121 4774 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9cae35f2-fcf0-4014-9b5b-9887d416e8d3-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:55:25 crc 
kubenswrapper[4774]: I1003 14:55:25.336532 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-vnvw7_9cae35f2-fcf0-4014-9b5b-9887d416e8d3/console/0.log" Oct 03 14:55:25 crc kubenswrapper[4774]: I1003 14:55:25.336597 4774 generic.go:334] "Generic (PLEG): container finished" podID="9cae35f2-fcf0-4014-9b5b-9887d416e8d3" containerID="3f4a8a0cb217cd6b8c10bbac0087200b71b5c5ae918194f1419ee50edf7d34c2" exitCode=2 Oct 03 14:55:25 crc kubenswrapper[4774]: I1003 14:55:25.336641 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vnvw7" event={"ID":"9cae35f2-fcf0-4014-9b5b-9887d416e8d3","Type":"ContainerDied","Data":"3f4a8a0cb217cd6b8c10bbac0087200b71b5c5ae918194f1419ee50edf7d34c2"} Oct 03 14:55:25 crc kubenswrapper[4774]: I1003 14:55:25.336682 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vnvw7" event={"ID":"9cae35f2-fcf0-4014-9b5b-9887d416e8d3","Type":"ContainerDied","Data":"b6c407d268222c6dab8c929e3b39f6c883db20758277461ca5ebab772f9fe318"} Oct 03 14:55:25 crc kubenswrapper[4774]: I1003 14:55:25.336709 4774 scope.go:117] "RemoveContainer" containerID="3f4a8a0cb217cd6b8c10bbac0087200b71b5c5ae918194f1419ee50edf7d34c2" Oct 03 14:55:25 crc kubenswrapper[4774]: I1003 14:55:25.336881 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-vnvw7" Oct 03 14:55:25 crc kubenswrapper[4774]: I1003 14:55:25.361151 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-vnvw7"] Oct 03 14:55:25 crc kubenswrapper[4774]: I1003 14:55:25.367043 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-vnvw7"] Oct 03 14:55:25 crc kubenswrapper[4774]: I1003 14:55:25.374933 4774 scope.go:117] "RemoveContainer" containerID="3f4a8a0cb217cd6b8c10bbac0087200b71b5c5ae918194f1419ee50edf7d34c2" Oct 03 14:55:25 crc kubenswrapper[4774]: E1003 14:55:25.375929 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f4a8a0cb217cd6b8c10bbac0087200b71b5c5ae918194f1419ee50edf7d34c2\": container with ID starting with 3f4a8a0cb217cd6b8c10bbac0087200b71b5c5ae918194f1419ee50edf7d34c2 not found: ID does not exist" containerID="3f4a8a0cb217cd6b8c10bbac0087200b71b5c5ae918194f1419ee50edf7d34c2" Oct 03 14:55:25 crc kubenswrapper[4774]: I1003 14:55:25.375980 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f4a8a0cb217cd6b8c10bbac0087200b71b5c5ae918194f1419ee50edf7d34c2"} err="failed to get container status \"3f4a8a0cb217cd6b8c10bbac0087200b71b5c5ae918194f1419ee50edf7d34c2\": rpc error: code = NotFound desc = could not find container \"3f4a8a0cb217cd6b8c10bbac0087200b71b5c5ae918194f1419ee50edf7d34c2\": container with ID starting with 3f4a8a0cb217cd6b8c10bbac0087200b71b5c5ae918194f1419ee50edf7d34c2 not found: ID does not exist" Oct 03 14:55:26 crc kubenswrapper[4774]: I1003 14:55:26.348439 4774 generic.go:334] "Generic (PLEG): container finished" podID="53c45343-23b3-4606-a5e1-bdd4c43b2752" containerID="2af570cd43b350b219fd241a82536fdbfd3d0ddfe1c0d4bdaabe415566d8136c" exitCode=0 Oct 03 14:55:26 crc kubenswrapper[4774]: I1003 14:55:26.348498 4774 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m" event={"ID":"53c45343-23b3-4606-a5e1-bdd4c43b2752","Type":"ContainerDied","Data":"2af570cd43b350b219fd241a82536fdbfd3d0ddfe1c0d4bdaabe415566d8136c"} Oct 03 14:55:27 crc kubenswrapper[4774]: I1003 14:55:27.313776 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cae35f2-fcf0-4014-9b5b-9887d416e8d3" path="/var/lib/kubelet/pods/9cae35f2-fcf0-4014-9b5b-9887d416e8d3/volumes" Oct 03 14:55:27 crc kubenswrapper[4774]: I1003 14:55:27.355910 4774 generic.go:334] "Generic (PLEG): container finished" podID="53c45343-23b3-4606-a5e1-bdd4c43b2752" containerID="ab9af736e4af8579c98f44f63abfa51bb923c1df483db7077d2dfc8f9d921bb5" exitCode=0 Oct 03 14:55:27 crc kubenswrapper[4774]: I1003 14:55:27.356015 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m" event={"ID":"53c45343-23b3-4606-a5e1-bdd4c43b2752","Type":"ContainerDied","Data":"ab9af736e4af8579c98f44f63abfa51bb923c1df483db7077d2dfc8f9d921bb5"} Oct 03 14:55:28 crc kubenswrapper[4774]: I1003 14:55:28.633027 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m" Oct 03 14:55:28 crc kubenswrapper[4774]: I1003 14:55:28.814961 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53c45343-23b3-4606-a5e1-bdd4c43b2752-bundle\") pod \"53c45343-23b3-4606-a5e1-bdd4c43b2752\" (UID: \"53c45343-23b3-4606-a5e1-bdd4c43b2752\") " Oct 03 14:55:28 crc kubenswrapper[4774]: I1003 14:55:28.815136 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh852\" (UniqueName: \"kubernetes.io/projected/53c45343-23b3-4606-a5e1-bdd4c43b2752-kube-api-access-gh852\") pod \"53c45343-23b3-4606-a5e1-bdd4c43b2752\" (UID: \"53c45343-23b3-4606-a5e1-bdd4c43b2752\") " Oct 03 14:55:28 crc kubenswrapper[4774]: I1003 14:55:28.815172 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53c45343-23b3-4606-a5e1-bdd4c43b2752-util\") pod \"53c45343-23b3-4606-a5e1-bdd4c43b2752\" (UID: \"53c45343-23b3-4606-a5e1-bdd4c43b2752\") " Oct 03 14:55:28 crc kubenswrapper[4774]: I1003 14:55:28.816695 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53c45343-23b3-4606-a5e1-bdd4c43b2752-bundle" (OuterVolumeSpecName: "bundle") pod "53c45343-23b3-4606-a5e1-bdd4c43b2752" (UID: "53c45343-23b3-4606-a5e1-bdd4c43b2752"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:55:28 crc kubenswrapper[4774]: I1003 14:55:28.824917 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53c45343-23b3-4606-a5e1-bdd4c43b2752-kube-api-access-gh852" (OuterVolumeSpecName: "kube-api-access-gh852") pod "53c45343-23b3-4606-a5e1-bdd4c43b2752" (UID: "53c45343-23b3-4606-a5e1-bdd4c43b2752"). InnerVolumeSpecName "kube-api-access-gh852". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:55:28 crc kubenswrapper[4774]: I1003 14:55:28.837786 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53c45343-23b3-4606-a5e1-bdd4c43b2752-util" (OuterVolumeSpecName: "util") pod "53c45343-23b3-4606-a5e1-bdd4c43b2752" (UID: "53c45343-23b3-4606-a5e1-bdd4c43b2752"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:55:28 crc kubenswrapper[4774]: I1003 14:55:28.916498 4774 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53c45343-23b3-4606-a5e1-bdd4c43b2752-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:55:28 crc kubenswrapper[4774]: I1003 14:55:28.916555 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh852\" (UniqueName: \"kubernetes.io/projected/53c45343-23b3-4606-a5e1-bdd4c43b2752-kube-api-access-gh852\") on node \"crc\" DevicePath \"\"" Oct 03 14:55:28 crc kubenswrapper[4774]: I1003 14:55:28.916581 4774 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53c45343-23b3-4606-a5e1-bdd4c43b2752-util\") on node \"crc\" DevicePath \"\"" Oct 03 14:55:29 crc kubenswrapper[4774]: I1003 14:55:29.371401 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m" event={"ID":"53c45343-23b3-4606-a5e1-bdd4c43b2752","Type":"ContainerDied","Data":"0ddb6f2bfd38be69893bbf739e2c0c098d27a041b6d49e220525d3ec22a9a117"} Oct 03 14:55:29 crc kubenswrapper[4774]: I1003 14:55:29.371474 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ddb6f2bfd38be69893bbf739e2c0c098d27a041b6d49e220525d3ec22a9a117" Oct 03 14:55:29 crc kubenswrapper[4774]: I1003 14:55:29.371556 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m" Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.175635 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vpxwc"] Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.176297 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-vpxwc" podUID="936c81dd-638d-4776-86fe-54a8aa53e50e" containerName="controller-manager" containerID="cri-o://a66f799d2dc18b24c2fdc49069ffea0ad7a3b04af0842ba43d4a8f2c85e9aa14" gracePeriod=30 Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.232499 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrmzx"] Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.232693 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrmzx" podUID="004c9445-7b3b-4479-bf6e-e6d880e4c7bb" containerName="route-controller-manager" containerID="cri-o://669804207e1ec76352a5affca2445e1821060af89b1e9c1b67bf41f81ac12332" gracePeriod=30 Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.409921 4774 generic.go:334] "Generic (PLEG): container finished" podID="936c81dd-638d-4776-86fe-54a8aa53e50e" containerID="a66f799d2dc18b24c2fdc49069ffea0ad7a3b04af0842ba43d4a8f2c85e9aa14" exitCode=0 Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.410004 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vpxwc" event={"ID":"936c81dd-638d-4776-86fe-54a8aa53e50e","Type":"ContainerDied","Data":"a66f799d2dc18b24c2fdc49069ffea0ad7a3b04af0842ba43d4a8f2c85e9aa14"} Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.412819 4774 generic.go:334] "Generic 
(PLEG): container finished" podID="004c9445-7b3b-4479-bf6e-e6d880e4c7bb" containerID="669804207e1ec76352a5affca2445e1821060af89b1e9c1b67bf41f81ac12332" exitCode=0 Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.412856 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrmzx" event={"ID":"004c9445-7b3b-4479-bf6e-e6d880e4c7bb","Type":"ContainerDied","Data":"669804207e1ec76352a5affca2445e1821060af89b1e9c1b67bf41f81ac12332"} Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.584339 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vpxwc" Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.648929 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrmzx" Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.682014 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcfcz\" (UniqueName: \"kubernetes.io/projected/936c81dd-638d-4776-86fe-54a8aa53e50e-kube-api-access-vcfcz\") pod \"936c81dd-638d-4776-86fe-54a8aa53e50e\" (UID: \"936c81dd-638d-4776-86fe-54a8aa53e50e\") " Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.682075 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/936c81dd-638d-4776-86fe-54a8aa53e50e-serving-cert\") pod \"936c81dd-638d-4776-86fe-54a8aa53e50e\" (UID: \"936c81dd-638d-4776-86fe-54a8aa53e50e\") " Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.682127 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/936c81dd-638d-4776-86fe-54a8aa53e50e-config\") pod \"936c81dd-638d-4776-86fe-54a8aa53e50e\" (UID: 
\"936c81dd-638d-4776-86fe-54a8aa53e50e\") " Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.682173 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/936c81dd-638d-4776-86fe-54a8aa53e50e-proxy-ca-bundles\") pod \"936c81dd-638d-4776-86fe-54a8aa53e50e\" (UID: \"936c81dd-638d-4776-86fe-54a8aa53e50e\") " Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.682224 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/936c81dd-638d-4776-86fe-54a8aa53e50e-client-ca\") pod \"936c81dd-638d-4776-86fe-54a8aa53e50e\" (UID: \"936c81dd-638d-4776-86fe-54a8aa53e50e\") " Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.682979 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/936c81dd-638d-4776-86fe-54a8aa53e50e-client-ca" (OuterVolumeSpecName: "client-ca") pod "936c81dd-638d-4776-86fe-54a8aa53e50e" (UID: "936c81dd-638d-4776-86fe-54a8aa53e50e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.683027 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/936c81dd-638d-4776-86fe-54a8aa53e50e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "936c81dd-638d-4776-86fe-54a8aa53e50e" (UID: "936c81dd-638d-4776-86fe-54a8aa53e50e"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.683063 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/936c81dd-638d-4776-86fe-54a8aa53e50e-config" (OuterVolumeSpecName: "config") pod "936c81dd-638d-4776-86fe-54a8aa53e50e" (UID: "936c81dd-638d-4776-86fe-54a8aa53e50e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.694548 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/936c81dd-638d-4776-86fe-54a8aa53e50e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "936c81dd-638d-4776-86fe-54a8aa53e50e" (UID: "936c81dd-638d-4776-86fe-54a8aa53e50e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.697063 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/936c81dd-638d-4776-86fe-54a8aa53e50e-kube-api-access-vcfcz" (OuterVolumeSpecName: "kube-api-access-vcfcz") pod "936c81dd-638d-4776-86fe-54a8aa53e50e" (UID: "936c81dd-638d-4776-86fe-54a8aa53e50e"). InnerVolumeSpecName "kube-api-access-vcfcz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.783166 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/004c9445-7b3b-4479-bf6e-e6d880e4c7bb-serving-cert\") pod \"004c9445-7b3b-4479-bf6e-e6d880e4c7bb\" (UID: \"004c9445-7b3b-4479-bf6e-e6d880e4c7bb\") " Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.783250 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/004c9445-7b3b-4479-bf6e-e6d880e4c7bb-config\") pod \"004c9445-7b3b-4479-bf6e-e6d880e4c7bb\" (UID: \"004c9445-7b3b-4479-bf6e-e6d880e4c7bb\") " Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.783280 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwmnx\" (UniqueName: \"kubernetes.io/projected/004c9445-7b3b-4479-bf6e-e6d880e4c7bb-kube-api-access-zwmnx\") pod \"004c9445-7b3b-4479-bf6e-e6d880e4c7bb\" (UID: \"004c9445-7b3b-4479-bf6e-e6d880e4c7bb\") 
" Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.783335 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/004c9445-7b3b-4479-bf6e-e6d880e4c7bb-client-ca\") pod \"004c9445-7b3b-4479-bf6e-e6d880e4c7bb\" (UID: \"004c9445-7b3b-4479-bf6e-e6d880e4c7bb\") " Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.783598 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcfcz\" (UniqueName: \"kubernetes.io/projected/936c81dd-638d-4776-86fe-54a8aa53e50e-kube-api-access-vcfcz\") on node \"crc\" DevicePath \"\"" Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.783621 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/936c81dd-638d-4776-86fe-54a8aa53e50e-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.783634 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/936c81dd-638d-4776-86fe-54a8aa53e50e-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.783645 4774 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/936c81dd-638d-4776-86fe-54a8aa53e50e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.783656 4774 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/936c81dd-638d-4776-86fe-54a8aa53e50e-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.783793 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/004c9445-7b3b-4479-bf6e-e6d880e4c7bb-config" (OuterVolumeSpecName: "config") pod "004c9445-7b3b-4479-bf6e-e6d880e4c7bb" (UID: 
"004c9445-7b3b-4479-bf6e-e6d880e4c7bb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.783803 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/004c9445-7b3b-4479-bf6e-e6d880e4c7bb-client-ca" (OuterVolumeSpecName: "client-ca") pod "004c9445-7b3b-4479-bf6e-e6d880e4c7bb" (UID: "004c9445-7b3b-4479-bf6e-e6d880e4c7bb"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.786063 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/004c9445-7b3b-4479-bf6e-e6d880e4c7bb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "004c9445-7b3b-4479-bf6e-e6d880e4c7bb" (UID: "004c9445-7b3b-4479-bf6e-e6d880e4c7bb"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.786095 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/004c9445-7b3b-4479-bf6e-e6d880e4c7bb-kube-api-access-zwmnx" (OuterVolumeSpecName: "kube-api-access-zwmnx") pod "004c9445-7b3b-4479-bf6e-e6d880e4c7bb" (UID: "004c9445-7b3b-4479-bf6e-e6d880e4c7bb"). InnerVolumeSpecName "kube-api-access-zwmnx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.884811 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/004c9445-7b3b-4479-bf6e-e6d880e4c7bb-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.884851 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/004c9445-7b3b-4479-bf6e-e6d880e4c7bb-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.884867 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwmnx\" (UniqueName: \"kubernetes.io/projected/004c9445-7b3b-4479-bf6e-e6d880e4c7bb-kube-api-access-zwmnx\") on node \"crc\" DevicePath \"\"" Oct 03 14:55:33 crc kubenswrapper[4774]: I1003 14:55:33.884881 4774 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/004c9445-7b3b-4479-bf6e-e6d880e4c7bb-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.419803 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vpxwc" event={"ID":"936c81dd-638d-4776-86fe-54a8aa53e50e","Type":"ContainerDied","Data":"8f1f4acc6d1aaba5181a3dcccc9fa5d2ab7f5d710936413b0d4d9db38922dfbb"} Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.420267 4774 scope.go:117] "RemoveContainer" containerID="a66f799d2dc18b24c2fdc49069ffea0ad7a3b04af0842ba43d4a8f2c85e9aa14" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.419845 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vpxwc" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.421424 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrmzx" event={"ID":"004c9445-7b3b-4479-bf6e-e6d880e4c7bb","Type":"ContainerDied","Data":"44d59f19cad3c4eb7cedc7ce28146ec22b2487fa1f51a5a5bd41ccbb40920bb5"} Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.421504 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrmzx" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.443552 4774 scope.go:117] "RemoveContainer" containerID="669804207e1ec76352a5affca2445e1821060af89b1e9c1b67bf41f81ac12332" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.458251 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vpxwc"] Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.468158 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vpxwc"] Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.474545 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrmzx"] Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.478090 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrmzx"] Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.691404 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-8fd4d6ccc-vmvl4"] Oct 03 14:55:34 crc kubenswrapper[4774]: E1003 14:55:34.691902 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="004c9445-7b3b-4479-bf6e-e6d880e4c7bb" 
containerName="route-controller-manager" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.691922 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="004c9445-7b3b-4479-bf6e-e6d880e4c7bb" containerName="route-controller-manager" Oct 03 14:55:34 crc kubenswrapper[4774]: E1003 14:55:34.691942 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cae35f2-fcf0-4014-9b5b-9887d416e8d3" containerName="console" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.691949 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cae35f2-fcf0-4014-9b5b-9887d416e8d3" containerName="console" Oct 03 14:55:34 crc kubenswrapper[4774]: E1003 14:55:34.691959 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53c45343-23b3-4606-a5e1-bdd4c43b2752" containerName="util" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.691967 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="53c45343-23b3-4606-a5e1-bdd4c43b2752" containerName="util" Oct 03 14:55:34 crc kubenswrapper[4774]: E1003 14:55:34.691979 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="936c81dd-638d-4776-86fe-54a8aa53e50e" containerName="controller-manager" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.691987 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="936c81dd-638d-4776-86fe-54a8aa53e50e" containerName="controller-manager" Oct 03 14:55:34 crc kubenswrapper[4774]: E1003 14:55:34.691996 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53c45343-23b3-4606-a5e1-bdd4c43b2752" containerName="pull" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.692003 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="53c45343-23b3-4606-a5e1-bdd4c43b2752" containerName="pull" Oct 03 14:55:34 crc kubenswrapper[4774]: E1003 14:55:34.692011 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53c45343-23b3-4606-a5e1-bdd4c43b2752" containerName="extract" Oct 03 14:55:34 crc 
kubenswrapper[4774]: I1003 14:55:34.692019 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="53c45343-23b3-4606-a5e1-bdd4c43b2752" containerName="extract" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.692135 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="004c9445-7b3b-4479-bf6e-e6d880e4c7bb" containerName="route-controller-manager" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.692153 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="936c81dd-638d-4776-86fe-54a8aa53e50e" containerName="controller-manager" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.692163 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="53c45343-23b3-4606-a5e1-bdd4c43b2752" containerName="extract" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.692172 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cae35f2-fcf0-4014-9b5b-9887d416e8d3" containerName="console" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.692817 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8fd4d6ccc-vmvl4" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.698913 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.701716 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.702151 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.703156 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.707863 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.709571 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5ffdbbcc98-7szm6"] Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.715220 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5ffdbbcc98-7szm6" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.717914 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.720980 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.722549 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.722746 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.722889 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.723037 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.723252 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.723397 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.724875 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8fd4d6ccc-vmvl4"] Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.740269 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-5ffdbbcc98-7szm6"] Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.794433 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0a2cf15-bfac-40d4-afde-e232be9ff1d4-proxy-ca-bundles\") pod \"controller-manager-8fd4d6ccc-vmvl4\" (UID: \"e0a2cf15-bfac-40d4-afde-e232be9ff1d4\") " pod="openshift-controller-manager/controller-manager-8fd4d6ccc-vmvl4" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.794476 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0a2cf15-bfac-40d4-afde-e232be9ff1d4-serving-cert\") pod \"controller-manager-8fd4d6ccc-vmvl4\" (UID: \"e0a2cf15-bfac-40d4-afde-e232be9ff1d4\") " pod="openshift-controller-manager/controller-manager-8fd4d6ccc-vmvl4" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.794508 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0a2cf15-bfac-40d4-afde-e232be9ff1d4-config\") pod \"controller-manager-8fd4d6ccc-vmvl4\" (UID: \"e0a2cf15-bfac-40d4-afde-e232be9ff1d4\") " pod="openshift-controller-manager/controller-manager-8fd4d6ccc-vmvl4" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.794531 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc4d4\" (UniqueName: \"kubernetes.io/projected/e0a2cf15-bfac-40d4-afde-e232be9ff1d4-kube-api-access-gc4d4\") pod \"controller-manager-8fd4d6ccc-vmvl4\" (UID: \"e0a2cf15-bfac-40d4-afde-e232be9ff1d4\") " pod="openshift-controller-manager/controller-manager-8fd4d6ccc-vmvl4" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.794560 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0a2cf15-bfac-40d4-afde-e232be9ff1d4-client-ca\") pod \"controller-manager-8fd4d6ccc-vmvl4\" (UID: \"e0a2cf15-bfac-40d4-afde-e232be9ff1d4\") " pod="openshift-controller-manager/controller-manager-8fd4d6ccc-vmvl4" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.895698 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f12872f2-179d-4c0f-8eaf-650e37595318-serving-cert\") pod \"route-controller-manager-5ffdbbcc98-7szm6\" (UID: \"f12872f2-179d-4c0f-8eaf-650e37595318\") " pod="openshift-route-controller-manager/route-controller-manager-5ffdbbcc98-7szm6" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.895749 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0a2cf15-bfac-40d4-afde-e232be9ff1d4-config\") pod \"controller-manager-8fd4d6ccc-vmvl4\" (UID: \"e0a2cf15-bfac-40d4-afde-e232be9ff1d4\") " pod="openshift-controller-manager/controller-manager-8fd4d6ccc-vmvl4" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.895780 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gc4d4\" (UniqueName: \"kubernetes.io/projected/e0a2cf15-bfac-40d4-afde-e232be9ff1d4-kube-api-access-gc4d4\") pod \"controller-manager-8fd4d6ccc-vmvl4\" (UID: \"e0a2cf15-bfac-40d4-afde-e232be9ff1d4\") " pod="openshift-controller-manager/controller-manager-8fd4d6ccc-vmvl4" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.895804 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f12872f2-179d-4c0f-8eaf-650e37595318-client-ca\") pod \"route-controller-manager-5ffdbbcc98-7szm6\" (UID: \"f12872f2-179d-4c0f-8eaf-650e37595318\") " 
pod="openshift-route-controller-manager/route-controller-manager-5ffdbbcc98-7szm6" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.895826 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0a2cf15-bfac-40d4-afde-e232be9ff1d4-client-ca\") pod \"controller-manager-8fd4d6ccc-vmvl4\" (UID: \"e0a2cf15-bfac-40d4-afde-e232be9ff1d4\") " pod="openshift-controller-manager/controller-manager-8fd4d6ccc-vmvl4" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.895843 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vdnn\" (UniqueName: \"kubernetes.io/projected/f12872f2-179d-4c0f-8eaf-650e37595318-kube-api-access-7vdnn\") pod \"route-controller-manager-5ffdbbcc98-7szm6\" (UID: \"f12872f2-179d-4c0f-8eaf-650e37595318\") " pod="openshift-route-controller-manager/route-controller-manager-5ffdbbcc98-7szm6" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.895891 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0a2cf15-bfac-40d4-afde-e232be9ff1d4-proxy-ca-bundles\") pod \"controller-manager-8fd4d6ccc-vmvl4\" (UID: \"e0a2cf15-bfac-40d4-afde-e232be9ff1d4\") " pod="openshift-controller-manager/controller-manager-8fd4d6ccc-vmvl4" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.895907 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0a2cf15-bfac-40d4-afde-e232be9ff1d4-serving-cert\") pod \"controller-manager-8fd4d6ccc-vmvl4\" (UID: \"e0a2cf15-bfac-40d4-afde-e232be9ff1d4\") " pod="openshift-controller-manager/controller-manager-8fd4d6ccc-vmvl4" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.896000 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f12872f2-179d-4c0f-8eaf-650e37595318-config\") pod \"route-controller-manager-5ffdbbcc98-7szm6\" (UID: \"f12872f2-179d-4c0f-8eaf-650e37595318\") " pod="openshift-route-controller-manager/route-controller-manager-5ffdbbcc98-7szm6" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.897233 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0a2cf15-bfac-40d4-afde-e232be9ff1d4-proxy-ca-bundles\") pod \"controller-manager-8fd4d6ccc-vmvl4\" (UID: \"e0a2cf15-bfac-40d4-afde-e232be9ff1d4\") " pod="openshift-controller-manager/controller-manager-8fd4d6ccc-vmvl4" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.897239 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0a2cf15-bfac-40d4-afde-e232be9ff1d4-client-ca\") pod \"controller-manager-8fd4d6ccc-vmvl4\" (UID: \"e0a2cf15-bfac-40d4-afde-e232be9ff1d4\") " pod="openshift-controller-manager/controller-manager-8fd4d6ccc-vmvl4" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.899079 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0a2cf15-bfac-40d4-afde-e232be9ff1d4-config\") pod \"controller-manager-8fd4d6ccc-vmvl4\" (UID: \"e0a2cf15-bfac-40d4-afde-e232be9ff1d4\") " pod="openshift-controller-manager/controller-manager-8fd4d6ccc-vmvl4" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.900196 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0a2cf15-bfac-40d4-afde-e232be9ff1d4-serving-cert\") pod \"controller-manager-8fd4d6ccc-vmvl4\" (UID: \"e0a2cf15-bfac-40d4-afde-e232be9ff1d4\") " pod="openshift-controller-manager/controller-manager-8fd4d6ccc-vmvl4" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.923745 4774 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-gc4d4\" (UniqueName: \"kubernetes.io/projected/e0a2cf15-bfac-40d4-afde-e232be9ff1d4-kube-api-access-gc4d4\") pod \"controller-manager-8fd4d6ccc-vmvl4\" (UID: \"e0a2cf15-bfac-40d4-afde-e232be9ff1d4\") " pod="openshift-controller-manager/controller-manager-8fd4d6ccc-vmvl4" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.997532 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f12872f2-179d-4c0f-8eaf-650e37595318-config\") pod \"route-controller-manager-5ffdbbcc98-7szm6\" (UID: \"f12872f2-179d-4c0f-8eaf-650e37595318\") " pod="openshift-route-controller-manager/route-controller-manager-5ffdbbcc98-7szm6" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.998903 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f12872f2-179d-4c0f-8eaf-650e37595318-serving-cert\") pod \"route-controller-manager-5ffdbbcc98-7szm6\" (UID: \"f12872f2-179d-4c0f-8eaf-650e37595318\") " pod="openshift-route-controller-manager/route-controller-manager-5ffdbbcc98-7szm6" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.998819 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f12872f2-179d-4c0f-8eaf-650e37595318-config\") pod \"route-controller-manager-5ffdbbcc98-7szm6\" (UID: \"f12872f2-179d-4c0f-8eaf-650e37595318\") " pod="openshift-route-controller-manager/route-controller-manager-5ffdbbcc98-7szm6" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.999003 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f12872f2-179d-4c0f-8eaf-650e37595318-client-ca\") pod \"route-controller-manager-5ffdbbcc98-7szm6\" (UID: \"f12872f2-179d-4c0f-8eaf-650e37595318\") " 
pod="openshift-route-controller-manager/route-controller-manager-5ffdbbcc98-7szm6" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.999510 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vdnn\" (UniqueName: \"kubernetes.io/projected/f12872f2-179d-4c0f-8eaf-650e37595318-kube-api-access-7vdnn\") pod \"route-controller-manager-5ffdbbcc98-7szm6\" (UID: \"f12872f2-179d-4c0f-8eaf-650e37595318\") " pod="openshift-route-controller-manager/route-controller-manager-5ffdbbcc98-7szm6" Oct 03 14:55:34 crc kubenswrapper[4774]: I1003 14:55:34.999950 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f12872f2-179d-4c0f-8eaf-650e37595318-client-ca\") pod \"route-controller-manager-5ffdbbcc98-7szm6\" (UID: \"f12872f2-179d-4c0f-8eaf-650e37595318\") " pod="openshift-route-controller-manager/route-controller-manager-5ffdbbcc98-7szm6" Oct 03 14:55:35 crc kubenswrapper[4774]: I1003 14:55:35.005972 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f12872f2-179d-4c0f-8eaf-650e37595318-serving-cert\") pod \"route-controller-manager-5ffdbbcc98-7szm6\" (UID: \"f12872f2-179d-4c0f-8eaf-650e37595318\") " pod="openshift-route-controller-manager/route-controller-manager-5ffdbbcc98-7szm6" Oct 03 14:55:35 crc kubenswrapper[4774]: I1003 14:55:35.017901 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vdnn\" (UniqueName: \"kubernetes.io/projected/f12872f2-179d-4c0f-8eaf-650e37595318-kube-api-access-7vdnn\") pod \"route-controller-manager-5ffdbbcc98-7szm6\" (UID: \"f12872f2-179d-4c0f-8eaf-650e37595318\") " pod="openshift-route-controller-manager/route-controller-manager-5ffdbbcc98-7szm6" Oct 03 14:55:35 crc kubenswrapper[4774]: I1003 14:55:35.024865 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8fd4d6ccc-vmvl4" Oct 03 14:55:35 crc kubenswrapper[4774]: I1003 14:55:35.049565 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5ffdbbcc98-7szm6" Oct 03 14:55:35 crc kubenswrapper[4774]: I1003 14:55:35.311027 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="004c9445-7b3b-4479-bf6e-e6d880e4c7bb" path="/var/lib/kubelet/pods/004c9445-7b3b-4479-bf6e-e6d880e4c7bb/volumes" Oct 03 14:55:35 crc kubenswrapper[4774]: I1003 14:55:35.311970 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="936c81dd-638d-4776-86fe-54a8aa53e50e" path="/var/lib/kubelet/pods/936c81dd-638d-4776-86fe-54a8aa53e50e/volumes" Oct 03 14:55:35 crc kubenswrapper[4774]: I1003 14:55:35.591650 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8fd4d6ccc-vmvl4"] Oct 03 14:55:35 crc kubenswrapper[4774]: I1003 14:55:35.633771 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5ffdbbcc98-7szm6"] Oct 03 14:55:35 crc kubenswrapper[4774]: W1003 14:55:35.635003 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf12872f2_179d_4c0f_8eaf_650e37595318.slice/crio-25b36fc6a79e5788652013577839dec8c19353caa3541c320dcf0864f05d2606 WatchSource:0}: Error finding container 25b36fc6a79e5788652013577839dec8c19353caa3541c320dcf0864f05d2606: Status 404 returned error can't find the container with id 25b36fc6a79e5788652013577839dec8c19353caa3541c320dcf0864f05d2606 Oct 03 14:55:36 crc kubenswrapper[4774]: I1003 14:55:36.434248 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8fd4d6ccc-vmvl4" 
event={"ID":"e0a2cf15-bfac-40d4-afde-e232be9ff1d4","Type":"ContainerStarted","Data":"1d3876e3434d41d3501401acb943c1cfa00063d2ff91b14d1b805dda172b8fc2"} Oct 03 14:55:36 crc kubenswrapper[4774]: I1003 14:55:36.434753 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8fd4d6ccc-vmvl4" event={"ID":"e0a2cf15-bfac-40d4-afde-e232be9ff1d4","Type":"ContainerStarted","Data":"ec5a6c3d373932ad2988394cb7a52058bea298ef5687bf8933440a9fc2585774"} Oct 03 14:55:36 crc kubenswrapper[4774]: I1003 14:55:36.435113 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-8fd4d6ccc-vmvl4" Oct 03 14:55:36 crc kubenswrapper[4774]: I1003 14:55:36.435855 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5ffdbbcc98-7szm6" event={"ID":"f12872f2-179d-4c0f-8eaf-650e37595318","Type":"ContainerStarted","Data":"96a8d37aaec8e0f546c82db781b5fea9e42fb2a4e1562b5418525ed2c1413b55"} Oct 03 14:55:36 crc kubenswrapper[4774]: I1003 14:55:36.435888 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5ffdbbcc98-7szm6" event={"ID":"f12872f2-179d-4c0f-8eaf-650e37595318","Type":"ContainerStarted","Data":"25b36fc6a79e5788652013577839dec8c19353caa3541c320dcf0864f05d2606"} Oct 03 14:55:36 crc kubenswrapper[4774]: I1003 14:55:36.436253 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5ffdbbcc98-7szm6" Oct 03 14:55:36 crc kubenswrapper[4774]: I1003 14:55:36.440934 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5ffdbbcc98-7szm6" Oct 03 14:55:36 crc kubenswrapper[4774]: I1003 14:55:36.441255 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-8fd4d6ccc-vmvl4" Oct 03 14:55:36 crc kubenswrapper[4774]: I1003 14:55:36.452394 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-8fd4d6ccc-vmvl4" podStartSLOduration=3.452339977 podStartE2EDuration="3.452339977s" podCreationTimestamp="2025-10-03 14:55:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:55:36.451031844 +0000 UTC m=+759.040235296" watchObservedRunningTime="2025-10-03 14:55:36.452339977 +0000 UTC m=+759.041543469" Oct 03 14:55:36 crc kubenswrapper[4774]: I1003 14:55:36.468472 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5ffdbbcc98-7szm6" podStartSLOduration=3.468450198 podStartE2EDuration="3.468450198s" podCreationTimestamp="2025-10-03 14:55:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:55:36.465235918 +0000 UTC m=+759.054439390" watchObservedRunningTime="2025-10-03 14:55:36.468450198 +0000 UTC m=+759.057653650" Oct 03 14:55:38 crc kubenswrapper[4774]: I1003 14:55:38.271480 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7f6cc8bc96-sdmhf"] Oct 03 14:55:38 crc kubenswrapper[4774]: I1003 14:55:38.272347 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7f6cc8bc96-sdmhf" Oct 03 14:55:38 crc kubenswrapper[4774]: I1003 14:55:38.277312 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 03 14:55:38 crc kubenswrapper[4774]: I1003 14:55:38.277312 4774 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 03 14:55:38 crc kubenswrapper[4774]: I1003 14:55:38.277450 4774 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 03 14:55:38 crc kubenswrapper[4774]: I1003 14:55:38.277702 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 03 14:55:38 crc kubenswrapper[4774]: I1003 14:55:38.280138 4774 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-gzb6p" Oct 03 14:55:38 crc kubenswrapper[4774]: I1003 14:55:38.296469 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7f6cc8bc96-sdmhf"] Oct 03 14:55:38 crc kubenswrapper[4774]: I1003 14:55:38.443565 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/87f70971-8510-4214-86dc-011aaf626b7a-apiservice-cert\") pod \"metallb-operator-controller-manager-7f6cc8bc96-sdmhf\" (UID: \"87f70971-8510-4214-86dc-011aaf626b7a\") " pod="metallb-system/metallb-operator-controller-manager-7f6cc8bc96-sdmhf" Oct 03 14:55:38 crc kubenswrapper[4774]: I1003 14:55:38.444699 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/87f70971-8510-4214-86dc-011aaf626b7a-webhook-cert\") pod \"metallb-operator-controller-manager-7f6cc8bc96-sdmhf\" (UID: 
\"87f70971-8510-4214-86dc-011aaf626b7a\") " pod="metallb-system/metallb-operator-controller-manager-7f6cc8bc96-sdmhf" Oct 03 14:55:38 crc kubenswrapper[4774]: I1003 14:55:38.445709 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f86d\" (UniqueName: \"kubernetes.io/projected/87f70971-8510-4214-86dc-011aaf626b7a-kube-api-access-9f86d\") pod \"metallb-operator-controller-manager-7f6cc8bc96-sdmhf\" (UID: \"87f70971-8510-4214-86dc-011aaf626b7a\") " pod="metallb-system/metallb-operator-controller-manager-7f6cc8bc96-sdmhf" Oct 03 14:55:38 crc kubenswrapper[4774]: I1003 14:55:38.508623 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-696d9855d4-wkrd4"] Oct 03 14:55:38 crc kubenswrapper[4774]: I1003 14:55:38.511207 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-696d9855d4-wkrd4" Oct 03 14:55:38 crc kubenswrapper[4774]: I1003 14:55:38.512656 4774 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-9h2nl" Oct 03 14:55:38 crc kubenswrapper[4774]: I1003 14:55:38.512906 4774 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 03 14:55:38 crc kubenswrapper[4774]: I1003 14:55:38.513190 4774 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 03 14:55:38 crc kubenswrapper[4774]: I1003 14:55:38.528415 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-696d9855d4-wkrd4"] Oct 03 14:55:38 crc kubenswrapper[4774]: I1003 14:55:38.548198 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/87f70971-8510-4214-86dc-011aaf626b7a-apiservice-cert\") pod 
\"metallb-operator-controller-manager-7f6cc8bc96-sdmhf\" (UID: \"87f70971-8510-4214-86dc-011aaf626b7a\") " pod="metallb-system/metallb-operator-controller-manager-7f6cc8bc96-sdmhf" Oct 03 14:55:38 crc kubenswrapper[4774]: I1003 14:55:38.548944 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/87f70971-8510-4214-86dc-011aaf626b7a-webhook-cert\") pod \"metallb-operator-controller-manager-7f6cc8bc96-sdmhf\" (UID: \"87f70971-8510-4214-86dc-011aaf626b7a\") " pod="metallb-system/metallb-operator-controller-manager-7f6cc8bc96-sdmhf" Oct 03 14:55:38 crc kubenswrapper[4774]: I1003 14:55:38.548970 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f86d\" (UniqueName: \"kubernetes.io/projected/87f70971-8510-4214-86dc-011aaf626b7a-kube-api-access-9f86d\") pod \"metallb-operator-controller-manager-7f6cc8bc96-sdmhf\" (UID: \"87f70971-8510-4214-86dc-011aaf626b7a\") " pod="metallb-system/metallb-operator-controller-manager-7f6cc8bc96-sdmhf" Oct 03 14:55:38 crc kubenswrapper[4774]: I1003 14:55:38.553972 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/87f70971-8510-4214-86dc-011aaf626b7a-apiservice-cert\") pod \"metallb-operator-controller-manager-7f6cc8bc96-sdmhf\" (UID: \"87f70971-8510-4214-86dc-011aaf626b7a\") " pod="metallb-system/metallb-operator-controller-manager-7f6cc8bc96-sdmhf" Oct 03 14:55:38 crc kubenswrapper[4774]: I1003 14:55:38.555077 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/87f70971-8510-4214-86dc-011aaf626b7a-webhook-cert\") pod \"metallb-operator-controller-manager-7f6cc8bc96-sdmhf\" (UID: \"87f70971-8510-4214-86dc-011aaf626b7a\") " pod="metallb-system/metallb-operator-controller-manager-7f6cc8bc96-sdmhf" Oct 03 14:55:38 crc kubenswrapper[4774]: I1003 14:55:38.574119 4774 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f86d\" (UniqueName: \"kubernetes.io/projected/87f70971-8510-4214-86dc-011aaf626b7a-kube-api-access-9f86d\") pod \"metallb-operator-controller-manager-7f6cc8bc96-sdmhf\" (UID: \"87f70971-8510-4214-86dc-011aaf626b7a\") " pod="metallb-system/metallb-operator-controller-manager-7f6cc8bc96-sdmhf" Oct 03 14:55:38 crc kubenswrapper[4774]: I1003 14:55:38.588227 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7f6cc8bc96-sdmhf" Oct 03 14:55:38 crc kubenswrapper[4774]: I1003 14:55:38.650235 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/637291ee-7c78-4978-a59f-da3c5d284724-webhook-cert\") pod \"metallb-operator-webhook-server-696d9855d4-wkrd4\" (UID: \"637291ee-7c78-4978-a59f-da3c5d284724\") " pod="metallb-system/metallb-operator-webhook-server-696d9855d4-wkrd4" Oct 03 14:55:38 crc kubenswrapper[4774]: I1003 14:55:38.650291 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsk7b\" (UniqueName: \"kubernetes.io/projected/637291ee-7c78-4978-a59f-da3c5d284724-kube-api-access-fsk7b\") pod \"metallb-operator-webhook-server-696d9855d4-wkrd4\" (UID: \"637291ee-7c78-4978-a59f-da3c5d284724\") " pod="metallb-system/metallb-operator-webhook-server-696d9855d4-wkrd4" Oct 03 14:55:38 crc kubenswrapper[4774]: I1003 14:55:38.650346 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/637291ee-7c78-4978-a59f-da3c5d284724-apiservice-cert\") pod \"metallb-operator-webhook-server-696d9855d4-wkrd4\" (UID: \"637291ee-7c78-4978-a59f-da3c5d284724\") " pod="metallb-system/metallb-operator-webhook-server-696d9855d4-wkrd4" Oct 03 14:55:38 crc kubenswrapper[4774]: 
I1003 14:55:38.752138 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/637291ee-7c78-4978-a59f-da3c5d284724-webhook-cert\") pod \"metallb-operator-webhook-server-696d9855d4-wkrd4\" (UID: \"637291ee-7c78-4978-a59f-da3c5d284724\") " pod="metallb-system/metallb-operator-webhook-server-696d9855d4-wkrd4" Oct 03 14:55:38 crc kubenswrapper[4774]: I1003 14:55:38.752192 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsk7b\" (UniqueName: \"kubernetes.io/projected/637291ee-7c78-4978-a59f-da3c5d284724-kube-api-access-fsk7b\") pod \"metallb-operator-webhook-server-696d9855d4-wkrd4\" (UID: \"637291ee-7c78-4978-a59f-da3c5d284724\") " pod="metallb-system/metallb-operator-webhook-server-696d9855d4-wkrd4" Oct 03 14:55:38 crc kubenswrapper[4774]: I1003 14:55:38.752241 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/637291ee-7c78-4978-a59f-da3c5d284724-apiservice-cert\") pod \"metallb-operator-webhook-server-696d9855d4-wkrd4\" (UID: \"637291ee-7c78-4978-a59f-da3c5d284724\") " pod="metallb-system/metallb-operator-webhook-server-696d9855d4-wkrd4" Oct 03 14:55:38 crc kubenswrapper[4774]: I1003 14:55:38.757256 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/637291ee-7c78-4978-a59f-da3c5d284724-apiservice-cert\") pod \"metallb-operator-webhook-server-696d9855d4-wkrd4\" (UID: \"637291ee-7c78-4978-a59f-da3c5d284724\") " pod="metallb-system/metallb-operator-webhook-server-696d9855d4-wkrd4" Oct 03 14:55:38 crc kubenswrapper[4774]: I1003 14:55:38.772191 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsk7b\" (UniqueName: \"kubernetes.io/projected/637291ee-7c78-4978-a59f-da3c5d284724-kube-api-access-fsk7b\") pod 
\"metallb-operator-webhook-server-696d9855d4-wkrd4\" (UID: \"637291ee-7c78-4978-a59f-da3c5d284724\") " pod="metallb-system/metallb-operator-webhook-server-696d9855d4-wkrd4" Oct 03 14:55:38 crc kubenswrapper[4774]: I1003 14:55:38.773598 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/637291ee-7c78-4978-a59f-da3c5d284724-webhook-cert\") pod \"metallb-operator-webhook-server-696d9855d4-wkrd4\" (UID: \"637291ee-7c78-4978-a59f-da3c5d284724\") " pod="metallb-system/metallb-operator-webhook-server-696d9855d4-wkrd4" Oct 03 14:55:38 crc kubenswrapper[4774]: I1003 14:55:38.829908 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-696d9855d4-wkrd4" Oct 03 14:55:39 crc kubenswrapper[4774]: I1003 14:55:39.093361 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7f6cc8bc96-sdmhf"] Oct 03 14:55:39 crc kubenswrapper[4774]: W1003 14:55:39.101789 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87f70971_8510_4214_86dc_011aaf626b7a.slice/crio-482852a4d3966b7f38bbba9e2843007a8ae2a8fc1fcb2dbaeeba609becbf71e9 WatchSource:0}: Error finding container 482852a4d3966b7f38bbba9e2843007a8ae2a8fc1fcb2dbaeeba609becbf71e9: Status 404 returned error can't find the container with id 482852a4d3966b7f38bbba9e2843007a8ae2a8fc1fcb2dbaeeba609becbf71e9 Oct 03 14:55:39 crc kubenswrapper[4774]: I1003 14:55:39.238237 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-696d9855d4-wkrd4"] Oct 03 14:55:39 crc kubenswrapper[4774]: W1003 14:55:39.246167 4774 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod637291ee_7c78_4978_a59f_da3c5d284724.slice/crio-8988972a5cda483123d0d55802ceefed11c2ed27554f97d490fbd34581e489cb WatchSource:0}: Error finding container 8988972a5cda483123d0d55802ceefed11c2ed27554f97d490fbd34581e489cb: Status 404 returned error can't find the container with id 8988972a5cda483123d0d55802ceefed11c2ed27554f97d490fbd34581e489cb Oct 03 14:55:39 crc kubenswrapper[4774]: I1003 14:55:39.455667 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7f6cc8bc96-sdmhf" event={"ID":"87f70971-8510-4214-86dc-011aaf626b7a","Type":"ContainerStarted","Data":"482852a4d3966b7f38bbba9e2843007a8ae2a8fc1fcb2dbaeeba609becbf71e9"} Oct 03 14:55:39 crc kubenswrapper[4774]: I1003 14:55:39.456857 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-696d9855d4-wkrd4" event={"ID":"637291ee-7c78-4978-a59f-da3c5d284724","Type":"ContainerStarted","Data":"8988972a5cda483123d0d55802ceefed11c2ed27554f97d490fbd34581e489cb"} Oct 03 14:55:42 crc kubenswrapper[4774]: I1003 14:55:42.163070 4774 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 03 14:55:44 crc kubenswrapper[4774]: I1003 14:55:44.490666 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7f6cc8bc96-sdmhf" event={"ID":"87f70971-8510-4214-86dc-011aaf626b7a","Type":"ContainerStarted","Data":"92b13fac4e6a53c9bb00cfde40a046c18a11cfa8a81881c3847a7ee63f60aa4e"} Oct 03 14:55:44 crc kubenswrapper[4774]: I1003 14:55:44.491209 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7f6cc8bc96-sdmhf" Oct 03 14:55:44 crc kubenswrapper[4774]: I1003 14:55:44.492957 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/metallb-operator-webhook-server-696d9855d4-wkrd4" event={"ID":"637291ee-7c78-4978-a59f-da3c5d284724","Type":"ContainerStarted","Data":"e0e64d7d14087d92dce67b3a450d699e384d4b2c6ef7c6c62b802a57df763850"} Oct 03 14:55:44 crc kubenswrapper[4774]: I1003 14:55:44.493365 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-696d9855d4-wkrd4" Oct 03 14:55:44 crc kubenswrapper[4774]: I1003 14:55:44.522145 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7f6cc8bc96-sdmhf" podStartSLOduration=1.651481552 podStartE2EDuration="6.522114496s" podCreationTimestamp="2025-10-03 14:55:38 +0000 UTC" firstStartedPulling="2025-10-03 14:55:39.104052611 +0000 UTC m=+761.693256063" lastFinishedPulling="2025-10-03 14:55:43.974685555 +0000 UTC m=+766.563889007" observedRunningTime="2025-10-03 14:55:44.520025544 +0000 UTC m=+767.109229036" watchObservedRunningTime="2025-10-03 14:55:44.522114496 +0000 UTC m=+767.111317998" Oct 03 14:55:44 crc kubenswrapper[4774]: I1003 14:55:44.565215 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-696d9855d4-wkrd4" podStartSLOduration=1.765042481 podStartE2EDuration="6.565189599s" podCreationTimestamp="2025-10-03 14:55:38 +0000 UTC" firstStartedPulling="2025-10-03 14:55:39.24969111 +0000 UTC m=+761.838894562" lastFinishedPulling="2025-10-03 14:55:44.049838188 +0000 UTC m=+766.639041680" observedRunningTime="2025-10-03 14:55:44.558778659 +0000 UTC m=+767.147982141" watchObservedRunningTime="2025-10-03 14:55:44.565189599 +0000 UTC m=+767.154393061" Oct 03 14:55:50 crc kubenswrapper[4774]: I1003 14:55:50.653272 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:55:50 crc kubenswrapper[4774]: I1003 14:55:50.653884 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:55:58 crc kubenswrapper[4774]: I1003 14:55:58.836300 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-696d9855d4-wkrd4" Oct 03 14:56:18 crc kubenswrapper[4774]: I1003 14:56:18.591639 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7f6cc8bc96-sdmhf" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.347047 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-dtqfw"] Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.348168 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-dtqfw" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.350306 4774 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.350511 4774 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-89qm5" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.352960 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-vt9fs"] Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.356665 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-vt9fs" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.358070 4774 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.358351 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.364279 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-dtqfw"] Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.428863 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-xfjwm"] Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.429942 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-xfjwm" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.433487 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-qp2nq"] Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.434268 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-qp2nq" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.438165 4774 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.438218 4774 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.438409 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.438614 4774 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.438770 4774 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-k42f9" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.446711 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-qp2nq"] Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.484443 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ll9z\" (UniqueName: \"kubernetes.io/projected/cf48682e-2440-425a-bbd5-ebc1597e265d-kube-api-access-2ll9z\") pod \"frr-k8s-vt9fs\" (UID: \"cf48682e-2440-425a-bbd5-ebc1597e265d\") " pod="metallb-system/frr-k8s-vt9fs" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.484675 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c555a1f-be66-4efe-81ed-d2d90bd5e2f7-cert\") pod \"frr-k8s-webhook-server-64bf5d555-dtqfw\" (UID: \"9c555a1f-be66-4efe-81ed-d2d90bd5e2f7\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-dtqfw" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.484776 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cf48682e-2440-425a-bbd5-ebc1597e265d-metrics-certs\") pod \"frr-k8s-vt9fs\" (UID: \"cf48682e-2440-425a-bbd5-ebc1597e265d\") " pod="metallb-system/frr-k8s-vt9fs" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.484852 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/cf48682e-2440-425a-bbd5-ebc1597e265d-frr-conf\") pod \"frr-k8s-vt9fs\" (UID: \"cf48682e-2440-425a-bbd5-ebc1597e265d\") " pod="metallb-system/frr-k8s-vt9fs" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.485005 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/cf48682e-2440-425a-bbd5-ebc1597e265d-frr-sockets\") pod \"frr-k8s-vt9fs\" (UID: \"cf48682e-2440-425a-bbd5-ebc1597e265d\") " pod="metallb-system/frr-k8s-vt9fs" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.485499 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/cf48682e-2440-425a-bbd5-ebc1597e265d-reloader\") pod \"frr-k8s-vt9fs\" (UID: \"cf48682e-2440-425a-bbd5-ebc1597e265d\") " pod="metallb-system/frr-k8s-vt9fs" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.485581 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/cf48682e-2440-425a-bbd5-ebc1597e265d-frr-startup\") pod \"frr-k8s-vt9fs\" (UID: \"cf48682e-2440-425a-bbd5-ebc1597e265d\") " pod="metallb-system/frr-k8s-vt9fs" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.485669 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5fcp\" (UniqueName: 
\"kubernetes.io/projected/9c555a1f-be66-4efe-81ed-d2d90bd5e2f7-kube-api-access-t5fcp\") pod \"frr-k8s-webhook-server-64bf5d555-dtqfw\" (UID: \"9c555a1f-be66-4efe-81ed-d2d90bd5e2f7\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-dtqfw" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.485748 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/cf48682e-2440-425a-bbd5-ebc1597e265d-metrics\") pod \"frr-k8s-vt9fs\" (UID: \"cf48682e-2440-425a-bbd5-ebc1597e265d\") " pod="metallb-system/frr-k8s-vt9fs" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.587370 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e5a0c71d-7887-4a39-b427-221389fecc1e-memberlist\") pod \"speaker-xfjwm\" (UID: \"e5a0c71d-7887-4a39-b427-221389fecc1e\") " pod="metallb-system/speaker-xfjwm" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.587466 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjz7m\" (UniqueName: \"kubernetes.io/projected/e5a0c71d-7887-4a39-b427-221389fecc1e-kube-api-access-hjz7m\") pod \"speaker-xfjwm\" (UID: \"e5a0c71d-7887-4a39-b427-221389fecc1e\") " pod="metallb-system/speaker-xfjwm" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.587502 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cf48682e-2440-425a-bbd5-ebc1597e265d-metrics-certs\") pod \"frr-k8s-vt9fs\" (UID: \"cf48682e-2440-425a-bbd5-ebc1597e265d\") " pod="metallb-system/frr-k8s-vt9fs" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.587526 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/cf48682e-2440-425a-bbd5-ebc1597e265d-frr-conf\") pod 
\"frr-k8s-vt9fs\" (UID: \"cf48682e-2440-425a-bbd5-ebc1597e265d\") " pod="metallb-system/frr-k8s-vt9fs" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.587554 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/cf48682e-2440-425a-bbd5-ebc1597e265d-frr-sockets\") pod \"frr-k8s-vt9fs\" (UID: \"cf48682e-2440-425a-bbd5-ebc1597e265d\") " pod="metallb-system/frr-k8s-vt9fs" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.587581 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/cf48682e-2440-425a-bbd5-ebc1597e265d-reloader\") pod \"frr-k8s-vt9fs\" (UID: \"cf48682e-2440-425a-bbd5-ebc1597e265d\") " pod="metallb-system/frr-k8s-vt9fs" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.587600 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/cf48682e-2440-425a-bbd5-ebc1597e265d-frr-startup\") pod \"frr-k8s-vt9fs\" (UID: \"cf48682e-2440-425a-bbd5-ebc1597e265d\") " pod="metallb-system/frr-k8s-vt9fs" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.587630 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5fcp\" (UniqueName: \"kubernetes.io/projected/9c555a1f-be66-4efe-81ed-d2d90bd5e2f7-kube-api-access-t5fcp\") pod \"frr-k8s-webhook-server-64bf5d555-dtqfw\" (UID: \"9c555a1f-be66-4efe-81ed-d2d90bd5e2f7\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-dtqfw" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.587649 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxph8\" (UniqueName: \"kubernetes.io/projected/625d3121-98b6-42e6-bc58-ea4bbdc5a7ad-kube-api-access-xxph8\") pod \"controller-68d546b9d8-qp2nq\" (UID: \"625d3121-98b6-42e6-bc58-ea4bbdc5a7ad\") " 
pod="metallb-system/controller-68d546b9d8-qp2nq" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.587674 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/cf48682e-2440-425a-bbd5-ebc1597e265d-metrics\") pod \"frr-k8s-vt9fs\" (UID: \"cf48682e-2440-425a-bbd5-ebc1597e265d\") " pod="metallb-system/frr-k8s-vt9fs" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.587694 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/625d3121-98b6-42e6-bc58-ea4bbdc5a7ad-cert\") pod \"controller-68d546b9d8-qp2nq\" (UID: \"625d3121-98b6-42e6-bc58-ea4bbdc5a7ad\") " pod="metallb-system/controller-68d546b9d8-qp2nq" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.587732 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ll9z\" (UniqueName: \"kubernetes.io/projected/cf48682e-2440-425a-bbd5-ebc1597e265d-kube-api-access-2ll9z\") pod \"frr-k8s-vt9fs\" (UID: \"cf48682e-2440-425a-bbd5-ebc1597e265d\") " pod="metallb-system/frr-k8s-vt9fs" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.587753 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c555a1f-be66-4efe-81ed-d2d90bd5e2f7-cert\") pod \"frr-k8s-webhook-server-64bf5d555-dtqfw\" (UID: \"9c555a1f-be66-4efe-81ed-d2d90bd5e2f7\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-dtqfw" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.587775 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/e5a0c71d-7887-4a39-b427-221389fecc1e-metallb-excludel2\") pod \"speaker-xfjwm\" (UID: \"e5a0c71d-7887-4a39-b427-221389fecc1e\") " pod="metallb-system/speaker-xfjwm" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 
14:56:19.587792 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/625d3121-98b6-42e6-bc58-ea4bbdc5a7ad-metrics-certs\") pod \"controller-68d546b9d8-qp2nq\" (UID: \"625d3121-98b6-42e6-bc58-ea4bbdc5a7ad\") " pod="metallb-system/controller-68d546b9d8-qp2nq" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.587829 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5a0c71d-7887-4a39-b427-221389fecc1e-metrics-certs\") pod \"speaker-xfjwm\" (UID: \"e5a0c71d-7887-4a39-b427-221389fecc1e\") " pod="metallb-system/speaker-xfjwm" Oct 03 14:56:19 crc kubenswrapper[4774]: E1003 14:56:19.587986 4774 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Oct 03 14:56:19 crc kubenswrapper[4774]: E1003 14:56:19.588062 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf48682e-2440-425a-bbd5-ebc1597e265d-metrics-certs podName:cf48682e-2440-425a-bbd5-ebc1597e265d nodeName:}" failed. No retries permitted until 2025-10-03 14:56:20.088041129 +0000 UTC m=+802.677244581 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cf48682e-2440-425a-bbd5-ebc1597e265d-metrics-certs") pod "frr-k8s-vt9fs" (UID: "cf48682e-2440-425a-bbd5-ebc1597e265d") : secret "frr-k8s-certs-secret" not found Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.588749 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/cf48682e-2440-425a-bbd5-ebc1597e265d-frr-conf\") pod \"frr-k8s-vt9fs\" (UID: \"cf48682e-2440-425a-bbd5-ebc1597e265d\") " pod="metallb-system/frr-k8s-vt9fs" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.588793 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/cf48682e-2440-425a-bbd5-ebc1597e265d-frr-sockets\") pod \"frr-k8s-vt9fs\" (UID: \"cf48682e-2440-425a-bbd5-ebc1597e265d\") " pod="metallb-system/frr-k8s-vt9fs" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.589447 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/cf48682e-2440-425a-bbd5-ebc1597e265d-reloader\") pod \"frr-k8s-vt9fs\" (UID: \"cf48682e-2440-425a-bbd5-ebc1597e265d\") " pod="metallb-system/frr-k8s-vt9fs" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.589470 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/cf48682e-2440-425a-bbd5-ebc1597e265d-metrics\") pod \"frr-k8s-vt9fs\" (UID: \"cf48682e-2440-425a-bbd5-ebc1597e265d\") " pod="metallb-system/frr-k8s-vt9fs" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.589925 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/cf48682e-2440-425a-bbd5-ebc1597e265d-frr-startup\") pod \"frr-k8s-vt9fs\" (UID: \"cf48682e-2440-425a-bbd5-ebc1597e265d\") " pod="metallb-system/frr-k8s-vt9fs" Oct 03 
14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.598162 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c555a1f-be66-4efe-81ed-d2d90bd5e2f7-cert\") pod \"frr-k8s-webhook-server-64bf5d555-dtqfw\" (UID: \"9c555a1f-be66-4efe-81ed-d2d90bd5e2f7\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-dtqfw" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.605993 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ll9z\" (UniqueName: \"kubernetes.io/projected/cf48682e-2440-425a-bbd5-ebc1597e265d-kube-api-access-2ll9z\") pod \"frr-k8s-vt9fs\" (UID: \"cf48682e-2440-425a-bbd5-ebc1597e265d\") " pod="metallb-system/frr-k8s-vt9fs" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.608996 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5fcp\" (UniqueName: \"kubernetes.io/projected/9c555a1f-be66-4efe-81ed-d2d90bd5e2f7-kube-api-access-t5fcp\") pod \"frr-k8s-webhook-server-64bf5d555-dtqfw\" (UID: \"9c555a1f-be66-4efe-81ed-d2d90bd5e2f7\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-dtqfw" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.666139 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-dtqfw" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.690513 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5a0c71d-7887-4a39-b427-221389fecc1e-metrics-certs\") pod \"speaker-xfjwm\" (UID: \"e5a0c71d-7887-4a39-b427-221389fecc1e\") " pod="metallb-system/speaker-xfjwm" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.690749 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e5a0c71d-7887-4a39-b427-221389fecc1e-memberlist\") pod \"speaker-xfjwm\" (UID: \"e5a0c71d-7887-4a39-b427-221389fecc1e\") " pod="metallb-system/speaker-xfjwm" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.690772 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjz7m\" (UniqueName: \"kubernetes.io/projected/e5a0c71d-7887-4a39-b427-221389fecc1e-kube-api-access-hjz7m\") pod \"speaker-xfjwm\" (UID: \"e5a0c71d-7887-4a39-b427-221389fecc1e\") " pod="metallb-system/speaker-xfjwm" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.690831 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxph8\" (UniqueName: \"kubernetes.io/projected/625d3121-98b6-42e6-bc58-ea4bbdc5a7ad-kube-api-access-xxph8\") pod \"controller-68d546b9d8-qp2nq\" (UID: \"625d3121-98b6-42e6-bc58-ea4bbdc5a7ad\") " pod="metallb-system/controller-68d546b9d8-qp2nq" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.690852 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/625d3121-98b6-42e6-bc58-ea4bbdc5a7ad-cert\") pod \"controller-68d546b9d8-qp2nq\" (UID: \"625d3121-98b6-42e6-bc58-ea4bbdc5a7ad\") " pod="metallb-system/controller-68d546b9d8-qp2nq" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 
14:56:19.690883 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/625d3121-98b6-42e6-bc58-ea4bbdc5a7ad-metrics-certs\") pod \"controller-68d546b9d8-qp2nq\" (UID: \"625d3121-98b6-42e6-bc58-ea4bbdc5a7ad\") " pod="metallb-system/controller-68d546b9d8-qp2nq" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.690898 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/e5a0c71d-7887-4a39-b427-221389fecc1e-metallb-excludel2\") pod \"speaker-xfjwm\" (UID: \"e5a0c71d-7887-4a39-b427-221389fecc1e\") " pod="metallb-system/speaker-xfjwm" Oct 03 14:56:19 crc kubenswrapper[4774]: E1003 14:56:19.691249 4774 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 03 14:56:19 crc kubenswrapper[4774]: E1003 14:56:19.691326 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5a0c71d-7887-4a39-b427-221389fecc1e-memberlist podName:e5a0c71d-7887-4a39-b427-221389fecc1e nodeName:}" failed. No retries permitted until 2025-10-03 14:56:20.191303782 +0000 UTC m=+802.780507244 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/e5a0c71d-7887-4a39-b427-221389fecc1e-memberlist") pod "speaker-xfjwm" (UID: "e5a0c71d-7887-4a39-b427-221389fecc1e") : secret "metallb-memberlist" not found Oct 03 14:56:19 crc kubenswrapper[4774]: E1003 14:56:19.691484 4774 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.691523 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/e5a0c71d-7887-4a39-b427-221389fecc1e-metallb-excludel2\") pod \"speaker-xfjwm\" (UID: \"e5a0c71d-7887-4a39-b427-221389fecc1e\") " pod="metallb-system/speaker-xfjwm" Oct 03 14:56:19 crc kubenswrapper[4774]: E1003 14:56:19.691550 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/625d3121-98b6-42e6-bc58-ea4bbdc5a7ad-metrics-certs podName:625d3121-98b6-42e6-bc58-ea4bbdc5a7ad nodeName:}" failed. No retries permitted until 2025-10-03 14:56:20.191532528 +0000 UTC m=+802.780736010 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/625d3121-98b6-42e6-bc58-ea4bbdc5a7ad-metrics-certs") pod "controller-68d546b9d8-qp2nq" (UID: "625d3121-98b6-42e6-bc58-ea4bbdc5a7ad") : secret "controller-certs-secret" not found Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.693042 4774 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.694710 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5a0c71d-7887-4a39-b427-221389fecc1e-metrics-certs\") pod \"speaker-xfjwm\" (UID: \"e5a0c71d-7887-4a39-b427-221389fecc1e\") " pod="metallb-system/speaker-xfjwm" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.704324 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/625d3121-98b6-42e6-bc58-ea4bbdc5a7ad-cert\") pod \"controller-68d546b9d8-qp2nq\" (UID: \"625d3121-98b6-42e6-bc58-ea4bbdc5a7ad\") " pod="metallb-system/controller-68d546b9d8-qp2nq" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.709039 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjz7m\" (UniqueName: \"kubernetes.io/projected/e5a0c71d-7887-4a39-b427-221389fecc1e-kube-api-access-hjz7m\") pod \"speaker-xfjwm\" (UID: \"e5a0c71d-7887-4a39-b427-221389fecc1e\") " pod="metallb-system/speaker-xfjwm" Oct 03 14:56:19 crc kubenswrapper[4774]: I1003 14:56:19.715211 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxph8\" (UniqueName: \"kubernetes.io/projected/625d3121-98b6-42e6-bc58-ea4bbdc5a7ad-kube-api-access-xxph8\") pod \"controller-68d546b9d8-qp2nq\" (UID: \"625d3121-98b6-42e6-bc58-ea4bbdc5a7ad\") " pod="metallb-system/controller-68d546b9d8-qp2nq" Oct 03 14:56:20 crc kubenswrapper[4774]: I1003 14:56:20.102768 4774 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cf48682e-2440-425a-bbd5-ebc1597e265d-metrics-certs\") pod \"frr-k8s-vt9fs\" (UID: \"cf48682e-2440-425a-bbd5-ebc1597e265d\") " pod="metallb-system/frr-k8s-vt9fs" Oct 03 14:56:20 crc kubenswrapper[4774]: I1003 14:56:20.107508 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cf48682e-2440-425a-bbd5-ebc1597e265d-metrics-certs\") pod \"frr-k8s-vt9fs\" (UID: \"cf48682e-2440-425a-bbd5-ebc1597e265d\") " pod="metallb-system/frr-k8s-vt9fs" Oct 03 14:56:20 crc kubenswrapper[4774]: I1003 14:56:20.136296 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-dtqfw"] Oct 03 14:56:20 crc kubenswrapper[4774]: I1003 14:56:20.203637 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/625d3121-98b6-42e6-bc58-ea4bbdc5a7ad-metrics-certs\") pod \"controller-68d546b9d8-qp2nq\" (UID: \"625d3121-98b6-42e6-bc58-ea4bbdc5a7ad\") " pod="metallb-system/controller-68d546b9d8-qp2nq" Oct 03 14:56:20 crc kubenswrapper[4774]: I1003 14:56:20.203696 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e5a0c71d-7887-4a39-b427-221389fecc1e-memberlist\") pod \"speaker-xfjwm\" (UID: \"e5a0c71d-7887-4a39-b427-221389fecc1e\") " pod="metallb-system/speaker-xfjwm" Oct 03 14:56:20 crc kubenswrapper[4774]: E1003 14:56:20.203897 4774 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 03 14:56:20 crc kubenswrapper[4774]: E1003 14:56:20.203954 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5a0c71d-7887-4a39-b427-221389fecc1e-memberlist podName:e5a0c71d-7887-4a39-b427-221389fecc1e nodeName:}" failed. 
No retries permitted until 2025-10-03 14:56:21.203939616 +0000 UTC m=+803.793143068 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/e5a0c71d-7887-4a39-b427-221389fecc1e-memberlist") pod "speaker-xfjwm" (UID: "e5a0c71d-7887-4a39-b427-221389fecc1e") : secret "metallb-memberlist" not found Oct 03 14:56:20 crc kubenswrapper[4774]: I1003 14:56:20.209965 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/625d3121-98b6-42e6-bc58-ea4bbdc5a7ad-metrics-certs\") pod \"controller-68d546b9d8-qp2nq\" (UID: \"625d3121-98b6-42e6-bc58-ea4bbdc5a7ad\") " pod="metallb-system/controller-68d546b9d8-qp2nq" Oct 03 14:56:20 crc kubenswrapper[4774]: I1003 14:56:20.279415 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-vt9fs" Oct 03 14:56:20 crc kubenswrapper[4774]: I1003 14:56:20.353233 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-qp2nq" Oct 03 14:56:20 crc kubenswrapper[4774]: I1003 14:56:20.654114 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:56:20 crc kubenswrapper[4774]: I1003 14:56:20.654194 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:56:20 crc kubenswrapper[4774]: I1003 14:56:20.654247 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" Oct 03 14:56:20 crc kubenswrapper[4774]: I1003 14:56:20.654937 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dd800268763caf1ba49e9f09998c3c8de0daa9481a442c7e1127db9996ab98ab"} pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 14:56:20 crc kubenswrapper[4774]: I1003 14:56:20.655001 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" containerID="cri-o://dd800268763caf1ba49e9f09998c3c8de0daa9481a442c7e1127db9996ab98ab" gracePeriod=600 Oct 03 14:56:20 crc kubenswrapper[4774]: I1003 14:56:20.701048 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-vt9fs" event={"ID":"cf48682e-2440-425a-bbd5-ebc1597e265d","Type":"ContainerStarted","Data":"88dc0cbf4f9e668dddce2a30df5660669988f50655b5a28e1086047e98239362"} Oct 03 14:56:20 crc kubenswrapper[4774]: I1003 14:56:20.702018 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-dtqfw" event={"ID":"9c555a1f-be66-4efe-81ed-d2d90bd5e2f7","Type":"ContainerStarted","Data":"e34ff74d858f39264b9938662cafc1a06d1fc7b62fd2c5fd11c8b04d129a64e3"} Oct 03 14:56:20 crc kubenswrapper[4774]: I1003 14:56:20.758521 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-qp2nq"] Oct 03 14:56:21 crc kubenswrapper[4774]: I1003 14:56:21.217471 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e5a0c71d-7887-4a39-b427-221389fecc1e-memberlist\") pod \"speaker-xfjwm\" (UID: \"e5a0c71d-7887-4a39-b427-221389fecc1e\") " pod="metallb-system/speaker-xfjwm" Oct 03 14:56:21 crc kubenswrapper[4774]: I1003 14:56:21.222086 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e5a0c71d-7887-4a39-b427-221389fecc1e-memberlist\") pod \"speaker-xfjwm\" (UID: \"e5a0c71d-7887-4a39-b427-221389fecc1e\") " pod="metallb-system/speaker-xfjwm" Oct 03 14:56:21 crc kubenswrapper[4774]: I1003 14:56:21.246223 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-xfjwm" Oct 03 14:56:21 crc kubenswrapper[4774]: I1003 14:56:21.711609 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-xfjwm" event={"ID":"e5a0c71d-7887-4a39-b427-221389fecc1e","Type":"ContainerStarted","Data":"685a3284db996eb1e310472047ec38b654a73e933448d0816db93d909aba9391"} Oct 03 14:56:21 crc kubenswrapper[4774]: I1003 14:56:21.712270 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-xfjwm" event={"ID":"e5a0c71d-7887-4a39-b427-221389fecc1e","Type":"ContainerStarted","Data":"ad3facb74d221804c0ebfb0942beeaac4df9b67d0c2b1f2891390070bbc97ac6"} Oct 03 14:56:21 crc kubenswrapper[4774]: I1003 14:56:21.713453 4774 generic.go:334] "Generic (PLEG): container finished" podID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerID="dd800268763caf1ba49e9f09998c3c8de0daa9481a442c7e1127db9996ab98ab" exitCode=0 Oct 03 14:56:21 crc kubenswrapper[4774]: I1003 14:56:21.713571 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" event={"ID":"ca37ac4b-f421-4198-a179-12901d36f0f5","Type":"ContainerDied","Data":"dd800268763caf1ba49e9f09998c3c8de0daa9481a442c7e1127db9996ab98ab"} Oct 03 14:56:21 crc kubenswrapper[4774]: I1003 14:56:21.713609 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" event={"ID":"ca37ac4b-f421-4198-a179-12901d36f0f5","Type":"ContainerStarted","Data":"f6858585d748d7516503bd2f90216465db181a380255963647c750b70d73b203"} Oct 03 14:56:21 crc kubenswrapper[4774]: I1003 14:56:21.713628 4774 scope.go:117] "RemoveContainer" containerID="6084213ff4d2e0141161ed1ee2c6ddcac4069da8f5bf554f9b528002b00a69ce" Oct 03 14:56:21 crc kubenswrapper[4774]: I1003 14:56:21.717476 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-qp2nq" 
event={"ID":"625d3121-98b6-42e6-bc58-ea4bbdc5a7ad","Type":"ContainerStarted","Data":"aecedd0c0670414767bd833d52060036095df3978c0c514c8ebf3316b2c84c00"} Oct 03 14:56:21 crc kubenswrapper[4774]: I1003 14:56:21.717501 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-qp2nq" event={"ID":"625d3121-98b6-42e6-bc58-ea4bbdc5a7ad","Type":"ContainerStarted","Data":"714d99413ba893a9a11bb817073ee60fffce0bd62911c031d039e37f12cb06bf"} Oct 03 14:56:21 crc kubenswrapper[4774]: I1003 14:56:21.717511 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-qp2nq" event={"ID":"625d3121-98b6-42e6-bc58-ea4bbdc5a7ad","Type":"ContainerStarted","Data":"fa6e752a2c3fe893070a95a8b175ed10643a7d81244ce3c65f02bdcec1c0c478"} Oct 03 14:56:21 crc kubenswrapper[4774]: I1003 14:56:21.717654 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-qp2nq" Oct 03 14:56:21 crc kubenswrapper[4774]: I1003 14:56:21.759215 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-qp2nq" podStartSLOduration=2.759194109 podStartE2EDuration="2.759194109s" podCreationTimestamp="2025-10-03 14:56:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:56:21.746584824 +0000 UTC m=+804.335788286" watchObservedRunningTime="2025-10-03 14:56:21.759194109 +0000 UTC m=+804.348397601" Oct 03 14:56:22 crc kubenswrapper[4774]: I1003 14:56:22.726229 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-xfjwm" event={"ID":"e5a0c71d-7887-4a39-b427-221389fecc1e","Type":"ContainerStarted","Data":"f528f82a1f9af410a4ec8b1ed50fe10ee2982fb0bf62d7bfa3c1bbb30ce829fb"} Oct 03 14:56:22 crc kubenswrapper[4774]: I1003 14:56:22.727323 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/speaker-xfjwm" Oct 03 14:56:22 crc kubenswrapper[4774]: I1003 14:56:22.745109 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-xfjwm" podStartSLOduration=3.745091295 podStartE2EDuration="3.745091295s" podCreationTimestamp="2025-10-03 14:56:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:56:22.744940941 +0000 UTC m=+805.334144393" watchObservedRunningTime="2025-10-03 14:56:22.745091295 +0000 UTC m=+805.334294747" Oct 03 14:56:27 crc kubenswrapper[4774]: I1003 14:56:27.801740 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-dtqfw" event={"ID":"9c555a1f-be66-4efe-81ed-d2d90bd5e2f7","Type":"ContainerStarted","Data":"7afccae2b01b5e6ed620428a6f95ce5aee46d0dfe1de83e1c8816a97ad8f460d"} Oct 03 14:56:27 crc kubenswrapper[4774]: I1003 14:56:27.802422 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-dtqfw" Oct 03 14:56:27 crc kubenswrapper[4774]: I1003 14:56:27.803761 4774 generic.go:334] "Generic (PLEG): container finished" podID="cf48682e-2440-425a-bbd5-ebc1597e265d" containerID="8c525a153de4b4dbd239b181064422e38576f3815ec197e7acc2155b95f857e0" exitCode=0 Oct 03 14:56:27 crc kubenswrapper[4774]: I1003 14:56:27.803813 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vt9fs" event={"ID":"cf48682e-2440-425a-bbd5-ebc1597e265d","Type":"ContainerDied","Data":"8c525a153de4b4dbd239b181064422e38576f3815ec197e7acc2155b95f857e0"} Oct 03 14:56:27 crc kubenswrapper[4774]: I1003 14:56:27.821645 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-dtqfw" podStartSLOduration=1.978810373 podStartE2EDuration="8.821621869s" podCreationTimestamp="2025-10-03 14:56:19 +0000 UTC" 
firstStartedPulling="2025-10-03 14:56:20.139298425 +0000 UTC m=+802.728501917" lastFinishedPulling="2025-10-03 14:56:26.982109971 +0000 UTC m=+809.571313413" observedRunningTime="2025-10-03 14:56:27.817415084 +0000 UTC m=+810.406618536" watchObservedRunningTime="2025-10-03 14:56:27.821621869 +0000 UTC m=+810.410825321" Oct 03 14:56:28 crc kubenswrapper[4774]: I1003 14:56:28.812508 4774 generic.go:334] "Generic (PLEG): container finished" podID="cf48682e-2440-425a-bbd5-ebc1597e265d" containerID="0203f777f3443d22505f523c0ff8a9f04836b407b5d911b23040a35cdbba43d8" exitCode=0 Oct 03 14:56:28 crc kubenswrapper[4774]: I1003 14:56:28.812591 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vt9fs" event={"ID":"cf48682e-2440-425a-bbd5-ebc1597e265d","Type":"ContainerDied","Data":"0203f777f3443d22505f523c0ff8a9f04836b407b5d911b23040a35cdbba43d8"} Oct 03 14:56:29 crc kubenswrapper[4774]: I1003 14:56:29.820803 4774 generic.go:334] "Generic (PLEG): container finished" podID="cf48682e-2440-425a-bbd5-ebc1597e265d" containerID="a240313114180e83846ad428289c3f75bab5b83b40770c41ece41cdde23a9490" exitCode=0 Oct 03 14:56:29 crc kubenswrapper[4774]: I1003 14:56:29.820921 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vt9fs" event={"ID":"cf48682e-2440-425a-bbd5-ebc1597e265d","Type":"ContainerDied","Data":"a240313114180e83846ad428289c3f75bab5b83b40770c41ece41cdde23a9490"} Oct 03 14:56:30 crc kubenswrapper[4774]: I1003 14:56:30.366698 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-qp2nq" Oct 03 14:56:30 crc kubenswrapper[4774]: I1003 14:56:30.833696 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vt9fs" event={"ID":"cf48682e-2440-425a-bbd5-ebc1597e265d","Type":"ContainerStarted","Data":"62de4273d8c158e8799021c7047312bb1b5fbf4ee0c69269e9adef0ef3ebce19"} Oct 03 14:56:30 crc kubenswrapper[4774]: I1003 14:56:30.833765 4774 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vt9fs" event={"ID":"cf48682e-2440-425a-bbd5-ebc1597e265d","Type":"ContainerStarted","Data":"e1a20bc2bea1480fd6da4c23738f220e1cd0731760977779262f461822c94653"} Oct 03 14:56:30 crc kubenswrapper[4774]: I1003 14:56:30.833791 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vt9fs" event={"ID":"cf48682e-2440-425a-bbd5-ebc1597e265d","Type":"ContainerStarted","Data":"518861bffe526f350185e45374ef57f636b9da02225735b092719f6015671fba"} Oct 03 14:56:30 crc kubenswrapper[4774]: I1003 14:56:30.833812 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vt9fs" event={"ID":"cf48682e-2440-425a-bbd5-ebc1597e265d","Type":"ContainerStarted","Data":"9eaeb83e64a34ed9f1ef1bde8bf64e73774c2cc0cc0a1f77f7a6794d8ffafb83"} Oct 03 14:56:30 crc kubenswrapper[4774]: I1003 14:56:30.833832 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vt9fs" event={"ID":"cf48682e-2440-425a-bbd5-ebc1597e265d","Type":"ContainerStarted","Data":"2770e7513df4616640358fbe6835073fe08e1dd8c1e5b0de5a464175eedd6d66"} Oct 03 14:56:30 crc kubenswrapper[4774]: I1003 14:56:30.833854 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vt9fs" event={"ID":"cf48682e-2440-425a-bbd5-ebc1597e265d","Type":"ContainerStarted","Data":"6c46ad52ff6d5deb0fe456168267a4106622248edccbe78f5511b93011ef06d5"} Oct 03 14:56:30 crc kubenswrapper[4774]: I1003 14:56:30.833916 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-vt9fs" Oct 03 14:56:30 crc kubenswrapper[4774]: I1003 14:56:30.866683 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-vt9fs" podStartSLOduration=5.317523196 podStartE2EDuration="11.866656034s" podCreationTimestamp="2025-10-03 14:56:19 +0000 UTC" firstStartedPulling="2025-10-03 14:56:20.430996984 +0000 UTC m=+803.020200436" 
lastFinishedPulling="2025-10-03 14:56:26.980129822 +0000 UTC m=+809.569333274" observedRunningTime="2025-10-03 14:56:30.860698625 +0000 UTC m=+813.449902097" watchObservedRunningTime="2025-10-03 14:56:30.866656034 +0000 UTC m=+813.455859526" Oct 03 14:56:31 crc kubenswrapper[4774]: I1003 14:56:31.250019 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-xfjwm" Oct 03 14:56:34 crc kubenswrapper[4774]: I1003 14:56:34.052150 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-hn45f"] Oct 03 14:56:34 crc kubenswrapper[4774]: I1003 14:56:34.053391 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hn45f" Oct 03 14:56:34 crc kubenswrapper[4774]: I1003 14:56:34.055431 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 03 14:56:34 crc kubenswrapper[4774]: I1003 14:56:34.056601 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-gn8sq" Oct 03 14:56:34 crc kubenswrapper[4774]: I1003 14:56:34.056660 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 03 14:56:34 crc kubenswrapper[4774]: I1003 14:56:34.101955 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hn45f"] Oct 03 14:56:34 crc kubenswrapper[4774]: I1003 14:56:34.189817 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x6wj\" (UniqueName: \"kubernetes.io/projected/80cd534f-1967-42d4-9274-3bbfcb2b0b72-kube-api-access-4x6wj\") pod \"openstack-operator-index-hn45f\" (UID: \"80cd534f-1967-42d4-9274-3bbfcb2b0b72\") " pod="openstack-operators/openstack-operator-index-hn45f" Oct 03 14:56:34 crc kubenswrapper[4774]: I1003 
14:56:34.290846 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x6wj\" (UniqueName: \"kubernetes.io/projected/80cd534f-1967-42d4-9274-3bbfcb2b0b72-kube-api-access-4x6wj\") pod \"openstack-operator-index-hn45f\" (UID: \"80cd534f-1967-42d4-9274-3bbfcb2b0b72\") " pod="openstack-operators/openstack-operator-index-hn45f" Oct 03 14:56:34 crc kubenswrapper[4774]: I1003 14:56:34.307771 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x6wj\" (UniqueName: \"kubernetes.io/projected/80cd534f-1967-42d4-9274-3bbfcb2b0b72-kube-api-access-4x6wj\") pod \"openstack-operator-index-hn45f\" (UID: \"80cd534f-1967-42d4-9274-3bbfcb2b0b72\") " pod="openstack-operators/openstack-operator-index-hn45f" Oct 03 14:56:34 crc kubenswrapper[4774]: I1003 14:56:34.384965 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hn45f" Oct 03 14:56:34 crc kubenswrapper[4774]: I1003 14:56:34.850496 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hn45f"] Oct 03 14:56:34 crc kubenswrapper[4774]: W1003 14:56:34.862300 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80cd534f_1967_42d4_9274_3bbfcb2b0b72.slice/crio-48b3d7df3b22027544ad1f9830fcd1672e23a7c39a09ae6181f63a8f23a05721 WatchSource:0}: Error finding container 48b3d7df3b22027544ad1f9830fcd1672e23a7c39a09ae6181f63a8f23a05721: Status 404 returned error can't find the container with id 48b3d7df3b22027544ad1f9830fcd1672e23a7c39a09ae6181f63a8f23a05721 Oct 03 14:56:35 crc kubenswrapper[4774]: I1003 14:56:35.279813 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-vt9fs" Oct 03 14:56:35 crc kubenswrapper[4774]: I1003 14:56:35.342354 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="metallb-system/frr-k8s-vt9fs" Oct 03 14:56:35 crc kubenswrapper[4774]: I1003 14:56:35.869594 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hn45f" event={"ID":"80cd534f-1967-42d4-9274-3bbfcb2b0b72","Type":"ContainerStarted","Data":"48b3d7df3b22027544ad1f9830fcd1672e23a7c39a09ae6181f63a8f23a05721"} Oct 03 14:56:37 crc kubenswrapper[4774]: I1003 14:56:37.433312 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-hn45f"] Oct 03 14:56:37 crc kubenswrapper[4774]: I1003 14:56:37.884461 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hn45f" event={"ID":"80cd534f-1967-42d4-9274-3bbfcb2b0b72","Type":"ContainerStarted","Data":"a32f5ef9abed2745afb408902ee01e29263410b59cf9a6cdbe7a8ebf6f3227e0"} Oct 03 14:56:37 crc kubenswrapper[4774]: I1003 14:56:37.904448 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-hn45f" podStartSLOduration=1.554182445 podStartE2EDuration="3.904325725s" podCreationTimestamp="2025-10-03 14:56:34 +0000 UTC" firstStartedPulling="2025-10-03 14:56:34.865121525 +0000 UTC m=+817.454325017" lastFinishedPulling="2025-10-03 14:56:37.215264845 +0000 UTC m=+819.804468297" observedRunningTime="2025-10-03 14:56:37.900955231 +0000 UTC m=+820.490158723" watchObservedRunningTime="2025-10-03 14:56:37.904325725 +0000 UTC m=+820.493529227" Oct 03 14:56:38 crc kubenswrapper[4774]: I1003 14:56:38.032273 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-2t96w"] Oct 03 14:56:38 crc kubenswrapper[4774]: I1003 14:56:38.035734 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-2t96w" Oct 03 14:56:38 crc kubenswrapper[4774]: I1003 14:56:38.041578 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-2t96w"] Oct 03 14:56:38 crc kubenswrapper[4774]: I1003 14:56:38.138110 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kvgx\" (UniqueName: \"kubernetes.io/projected/a97efab2-9188-4828-a600-d346b724f1f9-kube-api-access-9kvgx\") pod \"openstack-operator-index-2t96w\" (UID: \"a97efab2-9188-4828-a600-d346b724f1f9\") " pod="openstack-operators/openstack-operator-index-2t96w" Oct 03 14:56:38 crc kubenswrapper[4774]: I1003 14:56:38.239746 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kvgx\" (UniqueName: \"kubernetes.io/projected/a97efab2-9188-4828-a600-d346b724f1f9-kube-api-access-9kvgx\") pod \"openstack-operator-index-2t96w\" (UID: \"a97efab2-9188-4828-a600-d346b724f1f9\") " pod="openstack-operators/openstack-operator-index-2t96w" Oct 03 14:56:38 crc kubenswrapper[4774]: I1003 14:56:38.271926 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kvgx\" (UniqueName: \"kubernetes.io/projected/a97efab2-9188-4828-a600-d346b724f1f9-kube-api-access-9kvgx\") pod \"openstack-operator-index-2t96w\" (UID: \"a97efab2-9188-4828-a600-d346b724f1f9\") " pod="openstack-operators/openstack-operator-index-2t96w" Oct 03 14:56:38 crc kubenswrapper[4774]: I1003 14:56:38.358105 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-2t96w" Oct 03 14:56:38 crc kubenswrapper[4774]: I1003 14:56:38.842296 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-2t96w"] Oct 03 14:56:38 crc kubenswrapper[4774]: I1003 14:56:38.894950 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-hn45f" podUID="80cd534f-1967-42d4-9274-3bbfcb2b0b72" containerName="registry-server" containerID="cri-o://a32f5ef9abed2745afb408902ee01e29263410b59cf9a6cdbe7a8ebf6f3227e0" gracePeriod=2 Oct 03 14:56:38 crc kubenswrapper[4774]: I1003 14:56:38.896159 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2t96w" event={"ID":"a97efab2-9188-4828-a600-d346b724f1f9","Type":"ContainerStarted","Data":"39b6bb5b8e6df8c477c55c82b8a6998c3bf9e02d0c3f9df5cb126f97a2612f81"} Oct 03 14:56:39 crc kubenswrapper[4774]: I1003 14:56:39.339649 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hn45f" Oct 03 14:56:39 crc kubenswrapper[4774]: I1003 14:56:39.458799 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x6wj\" (UniqueName: \"kubernetes.io/projected/80cd534f-1967-42d4-9274-3bbfcb2b0b72-kube-api-access-4x6wj\") pod \"80cd534f-1967-42d4-9274-3bbfcb2b0b72\" (UID: \"80cd534f-1967-42d4-9274-3bbfcb2b0b72\") " Oct 03 14:56:39 crc kubenswrapper[4774]: I1003 14:56:39.464451 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80cd534f-1967-42d4-9274-3bbfcb2b0b72-kube-api-access-4x6wj" (OuterVolumeSpecName: "kube-api-access-4x6wj") pod "80cd534f-1967-42d4-9274-3bbfcb2b0b72" (UID: "80cd534f-1967-42d4-9274-3bbfcb2b0b72"). InnerVolumeSpecName "kube-api-access-4x6wj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:56:39 crc kubenswrapper[4774]: I1003 14:56:39.560595 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4x6wj\" (UniqueName: \"kubernetes.io/projected/80cd534f-1967-42d4-9274-3bbfcb2b0b72-kube-api-access-4x6wj\") on node \"crc\" DevicePath \"\"" Oct 03 14:56:39 crc kubenswrapper[4774]: I1003 14:56:39.676412 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-dtqfw" Oct 03 14:56:39 crc kubenswrapper[4774]: I1003 14:56:39.902982 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2t96w" event={"ID":"a97efab2-9188-4828-a600-d346b724f1f9","Type":"ContainerStarted","Data":"0d046475607dfcd30017e8bf1f2b41a1a63e2a1b73cae2477f4ef66276e1f7f1"} Oct 03 14:56:39 crc kubenswrapper[4774]: I1003 14:56:39.907150 4774 generic.go:334] "Generic (PLEG): container finished" podID="80cd534f-1967-42d4-9274-3bbfcb2b0b72" containerID="a32f5ef9abed2745afb408902ee01e29263410b59cf9a6cdbe7a8ebf6f3227e0" exitCode=0 Oct 03 14:56:39 crc kubenswrapper[4774]: I1003 14:56:39.907292 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hn45f" event={"ID":"80cd534f-1967-42d4-9274-3bbfcb2b0b72","Type":"ContainerDied","Data":"a32f5ef9abed2745afb408902ee01e29263410b59cf9a6cdbe7a8ebf6f3227e0"} Oct 03 14:56:39 crc kubenswrapper[4774]: I1003 14:56:39.907448 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hn45f" event={"ID":"80cd534f-1967-42d4-9274-3bbfcb2b0b72","Type":"ContainerDied","Data":"48b3d7df3b22027544ad1f9830fcd1672e23a7c39a09ae6181f63a8f23a05721"} Oct 03 14:56:39 crc kubenswrapper[4774]: I1003 14:56:39.907628 4774 scope.go:117] "RemoveContainer" containerID="a32f5ef9abed2745afb408902ee01e29263410b59cf9a6cdbe7a8ebf6f3227e0" Oct 03 14:56:39 crc kubenswrapper[4774]: I1003 
14:56:39.907664 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hn45f" Oct 03 14:56:39 crc kubenswrapper[4774]: I1003 14:56:39.924805 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-2t96w" podStartSLOduration=1.859211575 podStartE2EDuration="1.924787929s" podCreationTimestamp="2025-10-03 14:56:38 +0000 UTC" firstStartedPulling="2025-10-03 14:56:38.849956557 +0000 UTC m=+821.439160039" lastFinishedPulling="2025-10-03 14:56:38.915532931 +0000 UTC m=+821.504736393" observedRunningTime="2025-10-03 14:56:39.921010075 +0000 UTC m=+822.510213527" watchObservedRunningTime="2025-10-03 14:56:39.924787929 +0000 UTC m=+822.513991381" Oct 03 14:56:39 crc kubenswrapper[4774]: I1003 14:56:39.936290 4774 scope.go:117] "RemoveContainer" containerID="a32f5ef9abed2745afb408902ee01e29263410b59cf9a6cdbe7a8ebf6f3227e0" Oct 03 14:56:39 crc kubenswrapper[4774]: E1003 14:56:39.936746 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a32f5ef9abed2745afb408902ee01e29263410b59cf9a6cdbe7a8ebf6f3227e0\": container with ID starting with a32f5ef9abed2745afb408902ee01e29263410b59cf9a6cdbe7a8ebf6f3227e0 not found: ID does not exist" containerID="a32f5ef9abed2745afb408902ee01e29263410b59cf9a6cdbe7a8ebf6f3227e0" Oct 03 14:56:39 crc kubenswrapper[4774]: I1003 14:56:39.936794 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a32f5ef9abed2745afb408902ee01e29263410b59cf9a6cdbe7a8ebf6f3227e0"} err="failed to get container status \"a32f5ef9abed2745afb408902ee01e29263410b59cf9a6cdbe7a8ebf6f3227e0\": rpc error: code = NotFound desc = could not find container \"a32f5ef9abed2745afb408902ee01e29263410b59cf9a6cdbe7a8ebf6f3227e0\": container with ID starting with a32f5ef9abed2745afb408902ee01e29263410b59cf9a6cdbe7a8ebf6f3227e0 not 
found: ID does not exist" Oct 03 14:56:39 crc kubenswrapper[4774]: I1003 14:56:39.948324 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-hn45f"] Oct 03 14:56:39 crc kubenswrapper[4774]: I1003 14:56:39.951619 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-hn45f"] Oct 03 14:56:40 crc kubenswrapper[4774]: I1003 14:56:40.284631 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-vt9fs" Oct 03 14:56:41 crc kubenswrapper[4774]: I1003 14:56:41.307035 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80cd534f-1967-42d4-9274-3bbfcb2b0b72" path="/var/lib/kubelet/pods/80cd534f-1967-42d4-9274-3bbfcb2b0b72/volumes" Oct 03 14:56:48 crc kubenswrapper[4774]: I1003 14:56:48.359281 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-2t96w" Oct 03 14:56:48 crc kubenswrapper[4774]: I1003 14:56:48.359756 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-2t96w" Oct 03 14:56:48 crc kubenswrapper[4774]: I1003 14:56:48.387181 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-2t96w" Oct 03 14:56:49 crc kubenswrapper[4774]: I1003 14:56:49.014282 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-2t96w" Oct 03 14:56:56 crc kubenswrapper[4774]: I1003 14:56:56.334300 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh"] Oct 03 14:56:56 crc kubenswrapper[4774]: E1003 14:56:56.335136 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80cd534f-1967-42d4-9274-3bbfcb2b0b72" containerName="registry-server" Oct 03 14:56:56 
crc kubenswrapper[4774]: I1003 14:56:56.335155 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="80cd534f-1967-42d4-9274-3bbfcb2b0b72" containerName="registry-server" Oct 03 14:56:56 crc kubenswrapper[4774]: I1003 14:56:56.335352 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="80cd534f-1967-42d4-9274-3bbfcb2b0b72" containerName="registry-server" Oct 03 14:56:56 crc kubenswrapper[4774]: I1003 14:56:56.336622 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh" Oct 03 14:56:56 crc kubenswrapper[4774]: I1003 14:56:56.338881 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-6km27" Oct 03 14:56:56 crc kubenswrapper[4774]: I1003 14:56:56.346734 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh"] Oct 03 14:56:56 crc kubenswrapper[4774]: I1003 14:56:56.404938 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/128ae2e6-72c5-44ef-bf0a-6f54d80796cd-util\") pod \"ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh\" (UID: \"128ae2e6-72c5-44ef-bf0a-6f54d80796cd\") " pod="openstack-operators/ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh" Oct 03 14:56:56 crc kubenswrapper[4774]: I1003 14:56:56.405119 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/128ae2e6-72c5-44ef-bf0a-6f54d80796cd-bundle\") pod \"ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh\" (UID: \"128ae2e6-72c5-44ef-bf0a-6f54d80796cd\") " pod="openstack-operators/ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh" Oct 03 14:56:56 crc kubenswrapper[4774]: I1003 14:56:56.405173 
4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsscl\" (UniqueName: \"kubernetes.io/projected/128ae2e6-72c5-44ef-bf0a-6f54d80796cd-kube-api-access-qsscl\") pod \"ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh\" (UID: \"128ae2e6-72c5-44ef-bf0a-6f54d80796cd\") " pod="openstack-operators/ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh" Oct 03 14:56:56 crc kubenswrapper[4774]: I1003 14:56:56.506467 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/128ae2e6-72c5-44ef-bf0a-6f54d80796cd-bundle\") pod \"ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh\" (UID: \"128ae2e6-72c5-44ef-bf0a-6f54d80796cd\") " pod="openstack-operators/ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh" Oct 03 14:56:56 crc kubenswrapper[4774]: I1003 14:56:56.506527 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsscl\" (UniqueName: \"kubernetes.io/projected/128ae2e6-72c5-44ef-bf0a-6f54d80796cd-kube-api-access-qsscl\") pod \"ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh\" (UID: \"128ae2e6-72c5-44ef-bf0a-6f54d80796cd\") " pod="openstack-operators/ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh" Oct 03 14:56:56 crc kubenswrapper[4774]: I1003 14:56:56.506572 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/128ae2e6-72c5-44ef-bf0a-6f54d80796cd-util\") pod \"ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh\" (UID: \"128ae2e6-72c5-44ef-bf0a-6f54d80796cd\") " pod="openstack-operators/ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh" Oct 03 14:56:56 crc kubenswrapper[4774]: I1003 14:56:56.507000 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/128ae2e6-72c5-44ef-bf0a-6f54d80796cd-bundle\") pod \"ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh\" (UID: \"128ae2e6-72c5-44ef-bf0a-6f54d80796cd\") " pod="openstack-operators/ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh" Oct 03 14:56:56 crc kubenswrapper[4774]: I1003 14:56:56.507092 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/128ae2e6-72c5-44ef-bf0a-6f54d80796cd-util\") pod \"ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh\" (UID: \"128ae2e6-72c5-44ef-bf0a-6f54d80796cd\") " pod="openstack-operators/ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh" Oct 03 14:56:56 crc kubenswrapper[4774]: I1003 14:56:56.525117 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsscl\" (UniqueName: \"kubernetes.io/projected/128ae2e6-72c5-44ef-bf0a-6f54d80796cd-kube-api-access-qsscl\") pod \"ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh\" (UID: \"128ae2e6-72c5-44ef-bf0a-6f54d80796cd\") " pod="openstack-operators/ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh" Oct 03 14:56:56 crc kubenswrapper[4774]: I1003 14:56:56.670517 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh" Oct 03 14:56:56 crc kubenswrapper[4774]: I1003 14:56:56.918901 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh"] Oct 03 14:56:56 crc kubenswrapper[4774]: W1003 14:56:56.925902 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod128ae2e6_72c5_44ef_bf0a_6f54d80796cd.slice/crio-5285ae03e4d3c98a478719a7041000ad6757a6e3ff4cdac81af85f8ac4cded26 WatchSource:0}: Error finding container 5285ae03e4d3c98a478719a7041000ad6757a6e3ff4cdac81af85f8ac4cded26: Status 404 returned error can't find the container with id 5285ae03e4d3c98a478719a7041000ad6757a6e3ff4cdac81af85f8ac4cded26 Oct 03 14:56:57 crc kubenswrapper[4774]: I1003 14:56:57.043126 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh" event={"ID":"128ae2e6-72c5-44ef-bf0a-6f54d80796cd","Type":"ContainerStarted","Data":"5285ae03e4d3c98a478719a7041000ad6757a6e3ff4cdac81af85f8ac4cded26"} Oct 03 14:56:58 crc kubenswrapper[4774]: I1003 14:56:58.054037 4774 generic.go:334] "Generic (PLEG): container finished" podID="128ae2e6-72c5-44ef-bf0a-6f54d80796cd" containerID="c77402e06198128939fc112e07a42b6517410352c6f16e6c65badfb2a06364fe" exitCode=0 Oct 03 14:56:58 crc kubenswrapper[4774]: I1003 14:56:58.054175 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh" event={"ID":"128ae2e6-72c5-44ef-bf0a-6f54d80796cd","Type":"ContainerDied","Data":"c77402e06198128939fc112e07a42b6517410352c6f16e6c65badfb2a06364fe"} Oct 03 14:56:59 crc kubenswrapper[4774]: I1003 14:56:59.069351 4774 generic.go:334] "Generic (PLEG): container finished" 
podID="128ae2e6-72c5-44ef-bf0a-6f54d80796cd" containerID="94a542a003324e30b36f431f3d38e9b8d45e6c6664155376c51aed064c938d75" exitCode=0 Oct 03 14:56:59 crc kubenswrapper[4774]: I1003 14:56:59.069434 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh" event={"ID":"128ae2e6-72c5-44ef-bf0a-6f54d80796cd","Type":"ContainerDied","Data":"94a542a003324e30b36f431f3d38e9b8d45e6c6664155376c51aed064c938d75"} Oct 03 14:57:00 crc kubenswrapper[4774]: I1003 14:57:00.081396 4774 generic.go:334] "Generic (PLEG): container finished" podID="128ae2e6-72c5-44ef-bf0a-6f54d80796cd" containerID="9b8c07d62c75e447b317f961fd7d38f427c796597909dd24d3ab399bfdb03dab" exitCode=0 Oct 03 14:57:00 crc kubenswrapper[4774]: I1003 14:57:00.081429 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh" event={"ID":"128ae2e6-72c5-44ef-bf0a-6f54d80796cd","Type":"ContainerDied","Data":"9b8c07d62c75e447b317f961fd7d38f427c796597909dd24d3ab399bfdb03dab"} Oct 03 14:57:01 crc kubenswrapper[4774]: I1003 14:57:01.473589 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh" Oct 03 14:57:01 crc kubenswrapper[4774]: I1003 14:57:01.577613 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/128ae2e6-72c5-44ef-bf0a-6f54d80796cd-util\") pod \"128ae2e6-72c5-44ef-bf0a-6f54d80796cd\" (UID: \"128ae2e6-72c5-44ef-bf0a-6f54d80796cd\") " Oct 03 14:57:01 crc kubenswrapper[4774]: I1003 14:57:01.577687 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/128ae2e6-72c5-44ef-bf0a-6f54d80796cd-bundle\") pod \"128ae2e6-72c5-44ef-bf0a-6f54d80796cd\" (UID: \"128ae2e6-72c5-44ef-bf0a-6f54d80796cd\") " Oct 03 14:57:01 crc kubenswrapper[4774]: I1003 14:57:01.577770 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsscl\" (UniqueName: \"kubernetes.io/projected/128ae2e6-72c5-44ef-bf0a-6f54d80796cd-kube-api-access-qsscl\") pod \"128ae2e6-72c5-44ef-bf0a-6f54d80796cd\" (UID: \"128ae2e6-72c5-44ef-bf0a-6f54d80796cd\") " Oct 03 14:57:01 crc kubenswrapper[4774]: I1003 14:57:01.578847 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/128ae2e6-72c5-44ef-bf0a-6f54d80796cd-bundle" (OuterVolumeSpecName: "bundle") pod "128ae2e6-72c5-44ef-bf0a-6f54d80796cd" (UID: "128ae2e6-72c5-44ef-bf0a-6f54d80796cd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:57:01 crc kubenswrapper[4774]: I1003 14:57:01.583233 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/128ae2e6-72c5-44ef-bf0a-6f54d80796cd-kube-api-access-qsscl" (OuterVolumeSpecName: "kube-api-access-qsscl") pod "128ae2e6-72c5-44ef-bf0a-6f54d80796cd" (UID: "128ae2e6-72c5-44ef-bf0a-6f54d80796cd"). InnerVolumeSpecName "kube-api-access-qsscl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:57:01 crc kubenswrapper[4774]: I1003 14:57:01.591377 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/128ae2e6-72c5-44ef-bf0a-6f54d80796cd-util" (OuterVolumeSpecName: "util") pod "128ae2e6-72c5-44ef-bf0a-6f54d80796cd" (UID: "128ae2e6-72c5-44ef-bf0a-6f54d80796cd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:57:01 crc kubenswrapper[4774]: I1003 14:57:01.680016 4774 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/128ae2e6-72c5-44ef-bf0a-6f54d80796cd-util\") on node \"crc\" DevicePath \"\"" Oct 03 14:57:01 crc kubenswrapper[4774]: I1003 14:57:01.680071 4774 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/128ae2e6-72c5-44ef-bf0a-6f54d80796cd-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:57:01 crc kubenswrapper[4774]: I1003 14:57:01.680091 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsscl\" (UniqueName: \"kubernetes.io/projected/128ae2e6-72c5-44ef-bf0a-6f54d80796cd-kube-api-access-qsscl\") on node \"crc\" DevicePath \"\"" Oct 03 14:57:02 crc kubenswrapper[4774]: I1003 14:57:02.097454 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh" event={"ID":"128ae2e6-72c5-44ef-bf0a-6f54d80796cd","Type":"ContainerDied","Data":"5285ae03e4d3c98a478719a7041000ad6757a6e3ff4cdac81af85f8ac4cded26"} Oct 03 14:57:02 crc kubenswrapper[4774]: I1003 14:57:02.097511 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5285ae03e4d3c98a478719a7041000ad6757a6e3ff4cdac81af85f8ac4cded26" Oct 03 14:57:02 crc kubenswrapper[4774]: I1003 14:57:02.097540 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh" Oct 03 14:57:03 crc kubenswrapper[4774]: I1003 14:57:03.021476 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mrxjm"] Oct 03 14:57:03 crc kubenswrapper[4774]: E1003 14:57:03.022352 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128ae2e6-72c5-44ef-bf0a-6f54d80796cd" containerName="util" Oct 03 14:57:03 crc kubenswrapper[4774]: I1003 14:57:03.022417 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="128ae2e6-72c5-44ef-bf0a-6f54d80796cd" containerName="util" Oct 03 14:57:03 crc kubenswrapper[4774]: E1003 14:57:03.022449 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128ae2e6-72c5-44ef-bf0a-6f54d80796cd" containerName="extract" Oct 03 14:57:03 crc kubenswrapper[4774]: I1003 14:57:03.022469 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="128ae2e6-72c5-44ef-bf0a-6f54d80796cd" containerName="extract" Oct 03 14:57:03 crc kubenswrapper[4774]: E1003 14:57:03.022490 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128ae2e6-72c5-44ef-bf0a-6f54d80796cd" containerName="pull" Oct 03 14:57:03 crc kubenswrapper[4774]: I1003 14:57:03.022507 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="128ae2e6-72c5-44ef-bf0a-6f54d80796cd" containerName="pull" Oct 03 14:57:03 crc kubenswrapper[4774]: I1003 14:57:03.022748 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="128ae2e6-72c5-44ef-bf0a-6f54d80796cd" containerName="extract" Oct 03 14:57:03 crc kubenswrapper[4774]: I1003 14:57:03.024311 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mrxjm" Oct 03 14:57:03 crc kubenswrapper[4774]: I1003 14:57:03.045895 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mrxjm"] Oct 03 14:57:03 crc kubenswrapper[4774]: I1003 14:57:03.101989 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4da4144a-7e0c-4733-b462-1c4887e0f515-utilities\") pod \"community-operators-mrxjm\" (UID: \"4da4144a-7e0c-4733-b462-1c4887e0f515\") " pod="openshift-marketplace/community-operators-mrxjm" Oct 03 14:57:03 crc kubenswrapper[4774]: I1003 14:57:03.102047 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4da4144a-7e0c-4733-b462-1c4887e0f515-catalog-content\") pod \"community-operators-mrxjm\" (UID: \"4da4144a-7e0c-4733-b462-1c4887e0f515\") " pod="openshift-marketplace/community-operators-mrxjm" Oct 03 14:57:03 crc kubenswrapper[4774]: I1003 14:57:03.102132 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-662dt\" (UniqueName: \"kubernetes.io/projected/4da4144a-7e0c-4733-b462-1c4887e0f515-kube-api-access-662dt\") pod \"community-operators-mrxjm\" (UID: \"4da4144a-7e0c-4733-b462-1c4887e0f515\") " pod="openshift-marketplace/community-operators-mrxjm" Oct 03 14:57:03 crc kubenswrapper[4774]: I1003 14:57:03.203666 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4da4144a-7e0c-4733-b462-1c4887e0f515-utilities\") pod \"community-operators-mrxjm\" (UID: \"4da4144a-7e0c-4733-b462-1c4887e0f515\") " pod="openshift-marketplace/community-operators-mrxjm" Oct 03 14:57:03 crc kubenswrapper[4774]: I1003 14:57:03.203736 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4da4144a-7e0c-4733-b462-1c4887e0f515-catalog-content\") pod \"community-operators-mrxjm\" (UID: \"4da4144a-7e0c-4733-b462-1c4887e0f515\") " pod="openshift-marketplace/community-operators-mrxjm" Oct 03 14:57:03 crc kubenswrapper[4774]: I1003 14:57:03.204275 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4da4144a-7e0c-4733-b462-1c4887e0f515-catalog-content\") pod \"community-operators-mrxjm\" (UID: \"4da4144a-7e0c-4733-b462-1c4887e0f515\") " pod="openshift-marketplace/community-operators-mrxjm" Oct 03 14:57:03 crc kubenswrapper[4774]: I1003 14:57:03.204294 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4da4144a-7e0c-4733-b462-1c4887e0f515-utilities\") pod \"community-operators-mrxjm\" (UID: \"4da4144a-7e0c-4733-b462-1c4887e0f515\") " pod="openshift-marketplace/community-operators-mrxjm" Oct 03 14:57:03 crc kubenswrapper[4774]: I1003 14:57:03.204311 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-662dt\" (UniqueName: \"kubernetes.io/projected/4da4144a-7e0c-4733-b462-1c4887e0f515-kube-api-access-662dt\") pod \"community-operators-mrxjm\" (UID: \"4da4144a-7e0c-4733-b462-1c4887e0f515\") " pod="openshift-marketplace/community-operators-mrxjm" Oct 03 14:57:03 crc kubenswrapper[4774]: I1003 14:57:03.224625 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-662dt\" (UniqueName: \"kubernetes.io/projected/4da4144a-7e0c-4733-b462-1c4887e0f515-kube-api-access-662dt\") pod \"community-operators-mrxjm\" (UID: \"4da4144a-7e0c-4733-b462-1c4887e0f515\") " pod="openshift-marketplace/community-operators-mrxjm" Oct 03 14:57:03 crc kubenswrapper[4774]: I1003 14:57:03.361941 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mrxjm" Oct 03 14:57:03 crc kubenswrapper[4774]: I1003 14:57:03.814576 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mrxjm"] Oct 03 14:57:04 crc kubenswrapper[4774]: I1003 14:57:04.114497 4774 generic.go:334] "Generic (PLEG): container finished" podID="4da4144a-7e0c-4733-b462-1c4887e0f515" containerID="9dad5eb30e3fff6d9ddcf1c47b4cc0018140684af175e3ced7296b8160d5447a" exitCode=0 Oct 03 14:57:04 crc kubenswrapper[4774]: I1003 14:57:04.114576 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mrxjm" event={"ID":"4da4144a-7e0c-4733-b462-1c4887e0f515","Type":"ContainerDied","Data":"9dad5eb30e3fff6d9ddcf1c47b4cc0018140684af175e3ced7296b8160d5447a"} Oct 03 14:57:04 crc kubenswrapper[4774]: I1003 14:57:04.115400 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mrxjm" event={"ID":"4da4144a-7e0c-4733-b462-1c4887e0f515","Type":"ContainerStarted","Data":"4c394fef6df27925387d3f7d9180b844bf72b4084ae7668a129737076b4cb425"} Oct 03 14:57:05 crc kubenswrapper[4774]: I1003 14:57:05.126539 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mrxjm" event={"ID":"4da4144a-7e0c-4733-b462-1c4887e0f515","Type":"ContainerStarted","Data":"cdc154dcc10f7a632ea0fdd2b20514b961d38e778c3153509513dfb39465fb2d"} Oct 03 14:57:06 crc kubenswrapper[4774]: I1003 14:57:06.138298 4774 generic.go:334] "Generic (PLEG): container finished" podID="4da4144a-7e0c-4733-b462-1c4887e0f515" containerID="cdc154dcc10f7a632ea0fdd2b20514b961d38e778c3153509513dfb39465fb2d" exitCode=0 Oct 03 14:57:06 crc kubenswrapper[4774]: I1003 14:57:06.138409 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mrxjm" 
event={"ID":"4da4144a-7e0c-4733-b462-1c4887e0f515","Type":"ContainerDied","Data":"cdc154dcc10f7a632ea0fdd2b20514b961d38e778c3153509513dfb39465fb2d"} Oct 03 14:57:06 crc kubenswrapper[4774]: I1003 14:57:06.194419 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9hqmr"] Oct 03 14:57:06 crc kubenswrapper[4774]: I1003 14:57:06.197716 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9hqmr" Oct 03 14:57:06 crc kubenswrapper[4774]: I1003 14:57:06.208058 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9hqmr"] Oct 03 14:57:06 crc kubenswrapper[4774]: I1003 14:57:06.244103 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cb629a3-abef-436c-a44d-7b51d921d68e-utilities\") pod \"redhat-marketplace-9hqmr\" (UID: \"4cb629a3-abef-436c-a44d-7b51d921d68e\") " pod="openshift-marketplace/redhat-marketplace-9hqmr" Oct 03 14:57:06 crc kubenswrapper[4774]: I1003 14:57:06.244184 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cb629a3-abef-436c-a44d-7b51d921d68e-catalog-content\") pod \"redhat-marketplace-9hqmr\" (UID: \"4cb629a3-abef-436c-a44d-7b51d921d68e\") " pod="openshift-marketplace/redhat-marketplace-9hqmr" Oct 03 14:57:06 crc kubenswrapper[4774]: I1003 14:57:06.244349 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47kp2\" (UniqueName: \"kubernetes.io/projected/4cb629a3-abef-436c-a44d-7b51d921d68e-kube-api-access-47kp2\") pod \"redhat-marketplace-9hqmr\" (UID: \"4cb629a3-abef-436c-a44d-7b51d921d68e\") " pod="openshift-marketplace/redhat-marketplace-9hqmr" Oct 03 14:57:06 crc kubenswrapper[4774]: I1003 14:57:06.346119 4774 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cb629a3-abef-436c-a44d-7b51d921d68e-utilities\") pod \"redhat-marketplace-9hqmr\" (UID: \"4cb629a3-abef-436c-a44d-7b51d921d68e\") " pod="openshift-marketplace/redhat-marketplace-9hqmr" Oct 03 14:57:06 crc kubenswrapper[4774]: I1003 14:57:06.346172 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cb629a3-abef-436c-a44d-7b51d921d68e-catalog-content\") pod \"redhat-marketplace-9hqmr\" (UID: \"4cb629a3-abef-436c-a44d-7b51d921d68e\") " pod="openshift-marketplace/redhat-marketplace-9hqmr" Oct 03 14:57:06 crc kubenswrapper[4774]: I1003 14:57:06.346235 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47kp2\" (UniqueName: \"kubernetes.io/projected/4cb629a3-abef-436c-a44d-7b51d921d68e-kube-api-access-47kp2\") pod \"redhat-marketplace-9hqmr\" (UID: \"4cb629a3-abef-436c-a44d-7b51d921d68e\") " pod="openshift-marketplace/redhat-marketplace-9hqmr" Oct 03 14:57:06 crc kubenswrapper[4774]: I1003 14:57:06.346882 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cb629a3-abef-436c-a44d-7b51d921d68e-utilities\") pod \"redhat-marketplace-9hqmr\" (UID: \"4cb629a3-abef-436c-a44d-7b51d921d68e\") " pod="openshift-marketplace/redhat-marketplace-9hqmr" Oct 03 14:57:06 crc kubenswrapper[4774]: I1003 14:57:06.346916 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cb629a3-abef-436c-a44d-7b51d921d68e-catalog-content\") pod \"redhat-marketplace-9hqmr\" (UID: \"4cb629a3-abef-436c-a44d-7b51d921d68e\") " pod="openshift-marketplace/redhat-marketplace-9hqmr" Oct 03 14:57:06 crc kubenswrapper[4774]: I1003 14:57:06.372441 4774 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-47kp2\" (UniqueName: \"kubernetes.io/projected/4cb629a3-abef-436c-a44d-7b51d921d68e-kube-api-access-47kp2\") pod \"redhat-marketplace-9hqmr\" (UID: \"4cb629a3-abef-436c-a44d-7b51d921d68e\") " pod="openshift-marketplace/redhat-marketplace-9hqmr" Oct 03 14:57:06 crc kubenswrapper[4774]: I1003 14:57:06.525151 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9hqmr" Oct 03 14:57:06 crc kubenswrapper[4774]: I1003 14:57:06.991209 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9hqmr"] Oct 03 14:57:07 crc kubenswrapper[4774]: I1003 14:57:07.147003 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mrxjm" event={"ID":"4da4144a-7e0c-4733-b462-1c4887e0f515","Type":"ContainerStarted","Data":"331dae409616f776f8d1ff1e9aca73902cd6b52fe7cad8c16117c78c53938ff9"} Oct 03 14:57:07 crc kubenswrapper[4774]: I1003 14:57:07.148337 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9hqmr" event={"ID":"4cb629a3-abef-436c-a44d-7b51d921d68e","Type":"ContainerStarted","Data":"caa9a7ea4820f7ceac3863a7cbdc2b667814de90d082911b621928fa68f765cb"} Oct 03 14:57:07 crc kubenswrapper[4774]: I1003 14:57:07.148406 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9hqmr" event={"ID":"4cb629a3-abef-436c-a44d-7b51d921d68e","Type":"ContainerStarted","Data":"c7687d6338932133e307ff8244ad69801fd9aff226b0b1db845638291ddfd093"} Oct 03 14:57:07 crc kubenswrapper[4774]: I1003 14:57:07.166052 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mrxjm" podStartSLOduration=1.6766030010000001 podStartE2EDuration="4.166032009s" podCreationTimestamp="2025-10-03 14:57:03 +0000 UTC" firstStartedPulling="2025-10-03 14:57:04.116832657 +0000 UTC 
m=+846.706036119" lastFinishedPulling="2025-10-03 14:57:06.606261675 +0000 UTC m=+849.195465127" observedRunningTime="2025-10-03 14:57:07.162013069 +0000 UTC m=+849.751216521" watchObservedRunningTime="2025-10-03 14:57:07.166032009 +0000 UTC m=+849.755235461" Oct 03 14:57:08 crc kubenswrapper[4774]: I1003 14:57:08.155527 4774 generic.go:334] "Generic (PLEG): container finished" podID="4cb629a3-abef-436c-a44d-7b51d921d68e" containerID="caa9a7ea4820f7ceac3863a7cbdc2b667814de90d082911b621928fa68f765cb" exitCode=0 Oct 03 14:57:08 crc kubenswrapper[4774]: I1003 14:57:08.155585 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9hqmr" event={"ID":"4cb629a3-abef-436c-a44d-7b51d921d68e","Type":"ContainerDied","Data":"caa9a7ea4820f7ceac3863a7cbdc2b667814de90d082911b621928fa68f765cb"} Oct 03 14:57:08 crc kubenswrapper[4774]: I1003 14:57:08.625074 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7c89b76849-9klgg"] Oct 03 14:57:08 crc kubenswrapper[4774]: I1003 14:57:08.626489 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7c89b76849-9klgg" Oct 03 14:57:08 crc kubenswrapper[4774]: I1003 14:57:08.628130 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-xfmgk" Oct 03 14:57:08 crc kubenswrapper[4774]: I1003 14:57:08.678750 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znqp2\" (UniqueName: \"kubernetes.io/projected/4ad90ede-158d-4798-a2d5-399d61654604-kube-api-access-znqp2\") pod \"openstack-operator-controller-operator-7c89b76849-9klgg\" (UID: \"4ad90ede-158d-4798-a2d5-399d61654604\") " pod="openstack-operators/openstack-operator-controller-operator-7c89b76849-9klgg" Oct 03 14:57:08 crc kubenswrapper[4774]: I1003 14:57:08.705973 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7c89b76849-9klgg"] Oct 03 14:57:08 crc kubenswrapper[4774]: I1003 14:57:08.780014 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znqp2\" (UniqueName: \"kubernetes.io/projected/4ad90ede-158d-4798-a2d5-399d61654604-kube-api-access-znqp2\") pod \"openstack-operator-controller-operator-7c89b76849-9klgg\" (UID: \"4ad90ede-158d-4798-a2d5-399d61654604\") " pod="openstack-operators/openstack-operator-controller-operator-7c89b76849-9klgg" Oct 03 14:57:08 crc kubenswrapper[4774]: I1003 14:57:08.801066 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znqp2\" (UniqueName: \"kubernetes.io/projected/4ad90ede-158d-4798-a2d5-399d61654604-kube-api-access-znqp2\") pod \"openstack-operator-controller-operator-7c89b76849-9klgg\" (UID: \"4ad90ede-158d-4798-a2d5-399d61654604\") " pod="openstack-operators/openstack-operator-controller-operator-7c89b76849-9klgg" Oct 03 14:57:08 crc kubenswrapper[4774]: I1003 14:57:08.952525 4774 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7c89b76849-9klgg" Oct 03 14:57:09 crc kubenswrapper[4774]: I1003 14:57:09.171888 4774 generic.go:334] "Generic (PLEG): container finished" podID="4cb629a3-abef-436c-a44d-7b51d921d68e" containerID="62d696bd0a919bba73843a87fbb33cdff2b10e172ab60e95d23ed8d577980d76" exitCode=0 Oct 03 14:57:09 crc kubenswrapper[4774]: I1003 14:57:09.171927 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9hqmr" event={"ID":"4cb629a3-abef-436c-a44d-7b51d921d68e","Type":"ContainerDied","Data":"62d696bd0a919bba73843a87fbb33cdff2b10e172ab60e95d23ed8d577980d76"} Oct 03 14:57:09 crc kubenswrapper[4774]: I1003 14:57:09.421744 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7c89b76849-9klgg"] Oct 03 14:57:09 crc kubenswrapper[4774]: W1003 14:57:09.441200 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ad90ede_158d_4798_a2d5_399d61654604.slice/crio-cbcc178cef848a52e01444747045e215430133421352702a5a530436eb37a1b1 WatchSource:0}: Error finding container cbcc178cef848a52e01444747045e215430133421352702a5a530436eb37a1b1: Status 404 returned error can't find the container with id cbcc178cef848a52e01444747045e215430133421352702a5a530436eb37a1b1 Oct 03 14:57:10 crc kubenswrapper[4774]: I1003 14:57:10.189160 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7c89b76849-9klgg" event={"ID":"4ad90ede-158d-4798-a2d5-399d61654604","Type":"ContainerStarted","Data":"cbcc178cef848a52e01444747045e215430133421352702a5a530436eb37a1b1"} Oct 03 14:57:10 crc kubenswrapper[4774]: I1003 14:57:10.196073 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9hqmr" 
event={"ID":"4cb629a3-abef-436c-a44d-7b51d921d68e","Type":"ContainerStarted","Data":"b79b77d5d76d307ace60d3e1a3d57564bc3733c5b88aa4997d99e50d14e20ea0"} Oct 03 14:57:12 crc kubenswrapper[4774]: I1003 14:57:12.592545 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9hqmr" podStartSLOduration=5.196251991 podStartE2EDuration="6.592526805s" podCreationTimestamp="2025-10-03 14:57:06 +0000 UTC" firstStartedPulling="2025-10-03 14:57:08.157042458 +0000 UTC m=+850.746245910" lastFinishedPulling="2025-10-03 14:57:09.553317272 +0000 UTC m=+852.142520724" observedRunningTime="2025-10-03 14:57:10.211917459 +0000 UTC m=+852.801120921" watchObservedRunningTime="2025-10-03 14:57:12.592526805 +0000 UTC m=+855.181730257" Oct 03 14:57:12 crc kubenswrapper[4774]: I1003 14:57:12.593558 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qjrm4"] Oct 03 14:57:12 crc kubenswrapper[4774]: I1003 14:57:12.594901 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qjrm4" Oct 03 14:57:12 crc kubenswrapper[4774]: I1003 14:57:12.604419 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qjrm4"] Oct 03 14:57:12 crc kubenswrapper[4774]: I1003 14:57:12.653216 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8x6d\" (UniqueName: \"kubernetes.io/projected/97c881ca-6fb5-4f44-bf10-927f21fec469-kube-api-access-c8x6d\") pod \"certified-operators-qjrm4\" (UID: \"97c881ca-6fb5-4f44-bf10-927f21fec469\") " pod="openshift-marketplace/certified-operators-qjrm4" Oct 03 14:57:12 crc kubenswrapper[4774]: I1003 14:57:12.653273 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97c881ca-6fb5-4f44-bf10-927f21fec469-catalog-content\") pod \"certified-operators-qjrm4\" (UID: \"97c881ca-6fb5-4f44-bf10-927f21fec469\") " pod="openshift-marketplace/certified-operators-qjrm4" Oct 03 14:57:12 crc kubenswrapper[4774]: I1003 14:57:12.653322 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97c881ca-6fb5-4f44-bf10-927f21fec469-utilities\") pod \"certified-operators-qjrm4\" (UID: \"97c881ca-6fb5-4f44-bf10-927f21fec469\") " pod="openshift-marketplace/certified-operators-qjrm4" Oct 03 14:57:12 crc kubenswrapper[4774]: I1003 14:57:12.754822 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8x6d\" (UniqueName: \"kubernetes.io/projected/97c881ca-6fb5-4f44-bf10-927f21fec469-kube-api-access-c8x6d\") pod \"certified-operators-qjrm4\" (UID: \"97c881ca-6fb5-4f44-bf10-927f21fec469\") " pod="openshift-marketplace/certified-operators-qjrm4" Oct 03 14:57:12 crc kubenswrapper[4774]: I1003 14:57:12.754900 4774 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97c881ca-6fb5-4f44-bf10-927f21fec469-catalog-content\") pod \"certified-operators-qjrm4\" (UID: \"97c881ca-6fb5-4f44-bf10-927f21fec469\") " pod="openshift-marketplace/certified-operators-qjrm4" Oct 03 14:57:12 crc kubenswrapper[4774]: I1003 14:57:12.754973 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97c881ca-6fb5-4f44-bf10-927f21fec469-utilities\") pod \"certified-operators-qjrm4\" (UID: \"97c881ca-6fb5-4f44-bf10-927f21fec469\") " pod="openshift-marketplace/certified-operators-qjrm4" Oct 03 14:57:12 crc kubenswrapper[4774]: I1003 14:57:12.755626 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97c881ca-6fb5-4f44-bf10-927f21fec469-utilities\") pod \"certified-operators-qjrm4\" (UID: \"97c881ca-6fb5-4f44-bf10-927f21fec469\") " pod="openshift-marketplace/certified-operators-qjrm4" Oct 03 14:57:12 crc kubenswrapper[4774]: I1003 14:57:12.755723 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97c881ca-6fb5-4f44-bf10-927f21fec469-catalog-content\") pod \"certified-operators-qjrm4\" (UID: \"97c881ca-6fb5-4f44-bf10-927f21fec469\") " pod="openshift-marketplace/certified-operators-qjrm4" Oct 03 14:57:12 crc kubenswrapper[4774]: I1003 14:57:12.772878 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8x6d\" (UniqueName: \"kubernetes.io/projected/97c881ca-6fb5-4f44-bf10-927f21fec469-kube-api-access-c8x6d\") pod \"certified-operators-qjrm4\" (UID: \"97c881ca-6fb5-4f44-bf10-927f21fec469\") " pod="openshift-marketplace/certified-operators-qjrm4" Oct 03 14:57:12 crc kubenswrapper[4774]: I1003 14:57:12.920353 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qjrm4" Oct 03 14:57:13 crc kubenswrapper[4774]: I1003 14:57:13.362749 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mrxjm" Oct 03 14:57:13 crc kubenswrapper[4774]: I1003 14:57:13.363134 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mrxjm" Oct 03 14:57:13 crc kubenswrapper[4774]: I1003 14:57:13.407247 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mrxjm" Oct 03 14:57:13 crc kubenswrapper[4774]: I1003 14:57:13.746633 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qjrm4"] Oct 03 14:57:13 crc kubenswrapper[4774]: W1003 14:57:13.747401 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97c881ca_6fb5_4f44_bf10_927f21fec469.slice/crio-7e912406d692867b3fa4be12601aae00bd841a2d5f5c82fc557a28fd90e3e8c3 WatchSource:0}: Error finding container 7e912406d692867b3fa4be12601aae00bd841a2d5f5c82fc557a28fd90e3e8c3: Status 404 returned error can't find the container with id 7e912406d692867b3fa4be12601aae00bd841a2d5f5c82fc557a28fd90e3e8c3 Oct 03 14:57:14 crc kubenswrapper[4774]: I1003 14:57:14.224914 4774 generic.go:334] "Generic (PLEG): container finished" podID="97c881ca-6fb5-4f44-bf10-927f21fec469" containerID="2bfeb9dc80edf7a0a764d0f24e70b4fa83118ad0f5d82f609359f3e2d7516c96" exitCode=0 Oct 03 14:57:14 crc kubenswrapper[4774]: I1003 14:57:14.225095 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjrm4" event={"ID":"97c881ca-6fb5-4f44-bf10-927f21fec469","Type":"ContainerDied","Data":"2bfeb9dc80edf7a0a764d0f24e70b4fa83118ad0f5d82f609359f3e2d7516c96"} Oct 03 14:57:14 crc kubenswrapper[4774]: I1003 
14:57:14.225313 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjrm4" event={"ID":"97c881ca-6fb5-4f44-bf10-927f21fec469","Type":"ContainerStarted","Data":"7e912406d692867b3fa4be12601aae00bd841a2d5f5c82fc557a28fd90e3e8c3"} Oct 03 14:57:14 crc kubenswrapper[4774]: I1003 14:57:14.245594 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7c89b76849-9klgg" event={"ID":"4ad90ede-158d-4798-a2d5-399d61654604","Type":"ContainerStarted","Data":"8de5bed602ec7f8452d93686ed26ab8fd3818a4158a4081f52ee25d10fce9153"} Oct 03 14:57:14 crc kubenswrapper[4774]: I1003 14:57:14.286824 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mrxjm" Oct 03 14:57:15 crc kubenswrapper[4774]: I1003 14:57:15.984098 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mrxjm"] Oct 03 14:57:16 crc kubenswrapper[4774]: I1003 14:57:16.262486 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mrxjm" podUID="4da4144a-7e0c-4733-b462-1c4887e0f515" containerName="registry-server" containerID="cri-o://331dae409616f776f8d1ff1e9aca73902cd6b52fe7cad8c16117c78c53938ff9" gracePeriod=2 Oct 03 14:57:16 crc kubenswrapper[4774]: I1003 14:57:16.527070 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9hqmr" Oct 03 14:57:16 crc kubenswrapper[4774]: I1003 14:57:16.528017 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9hqmr" Oct 03 14:57:16 crc kubenswrapper[4774]: I1003 14:57:16.580739 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9hqmr" Oct 03 14:57:16 crc kubenswrapper[4774]: I1003 14:57:16.645772 4774 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mrxjm" Oct 03 14:57:16 crc kubenswrapper[4774]: I1003 14:57:16.704994 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4da4144a-7e0c-4733-b462-1c4887e0f515-utilities\") pod \"4da4144a-7e0c-4733-b462-1c4887e0f515\" (UID: \"4da4144a-7e0c-4733-b462-1c4887e0f515\") " Oct 03 14:57:16 crc kubenswrapper[4774]: I1003 14:57:16.705059 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-662dt\" (UniqueName: \"kubernetes.io/projected/4da4144a-7e0c-4733-b462-1c4887e0f515-kube-api-access-662dt\") pod \"4da4144a-7e0c-4733-b462-1c4887e0f515\" (UID: \"4da4144a-7e0c-4733-b462-1c4887e0f515\") " Oct 03 14:57:16 crc kubenswrapper[4774]: I1003 14:57:16.705095 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4da4144a-7e0c-4733-b462-1c4887e0f515-catalog-content\") pod \"4da4144a-7e0c-4733-b462-1c4887e0f515\" (UID: \"4da4144a-7e0c-4733-b462-1c4887e0f515\") " Oct 03 14:57:16 crc kubenswrapper[4774]: I1003 14:57:16.706096 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4da4144a-7e0c-4733-b462-1c4887e0f515-utilities" (OuterVolumeSpecName: "utilities") pod "4da4144a-7e0c-4733-b462-1c4887e0f515" (UID: "4da4144a-7e0c-4733-b462-1c4887e0f515"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:57:16 crc kubenswrapper[4774]: I1003 14:57:16.715537 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4da4144a-7e0c-4733-b462-1c4887e0f515-kube-api-access-662dt" (OuterVolumeSpecName: "kube-api-access-662dt") pod "4da4144a-7e0c-4733-b462-1c4887e0f515" (UID: "4da4144a-7e0c-4733-b462-1c4887e0f515"). 
InnerVolumeSpecName "kube-api-access-662dt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:57:16 crc kubenswrapper[4774]: I1003 14:57:16.751834 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4da4144a-7e0c-4733-b462-1c4887e0f515-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4da4144a-7e0c-4733-b462-1c4887e0f515" (UID: "4da4144a-7e0c-4733-b462-1c4887e0f515"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:57:16 crc kubenswrapper[4774]: I1003 14:57:16.806862 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-662dt\" (UniqueName: \"kubernetes.io/projected/4da4144a-7e0c-4733-b462-1c4887e0f515-kube-api-access-662dt\") on node \"crc\" DevicePath \"\"" Oct 03 14:57:16 crc kubenswrapper[4774]: I1003 14:57:16.806889 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4da4144a-7e0c-4733-b462-1c4887e0f515-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:57:16 crc kubenswrapper[4774]: I1003 14:57:16.806898 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4da4144a-7e0c-4733-b462-1c4887e0f515-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:57:17 crc kubenswrapper[4774]: I1003 14:57:17.273716 4774 generic.go:334] "Generic (PLEG): container finished" podID="97c881ca-6fb5-4f44-bf10-927f21fec469" containerID="5433900289c8a5c4069f2aa77b292558b1a64b0d3f7d834320cefd41c8facaba" exitCode=0 Oct 03 14:57:17 crc kubenswrapper[4774]: I1003 14:57:17.273786 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjrm4" event={"ID":"97c881ca-6fb5-4f44-bf10-927f21fec469","Type":"ContainerDied","Data":"5433900289c8a5c4069f2aa77b292558b1a64b0d3f7d834320cefd41c8facaba"} Oct 03 14:57:17 crc kubenswrapper[4774]: I1003 14:57:17.277572 
4774 generic.go:334] "Generic (PLEG): container finished" podID="4da4144a-7e0c-4733-b462-1c4887e0f515" containerID="331dae409616f776f8d1ff1e9aca73902cd6b52fe7cad8c16117c78c53938ff9" exitCode=0 Oct 03 14:57:17 crc kubenswrapper[4774]: I1003 14:57:17.277620 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mrxjm" event={"ID":"4da4144a-7e0c-4733-b462-1c4887e0f515","Type":"ContainerDied","Data":"331dae409616f776f8d1ff1e9aca73902cd6b52fe7cad8c16117c78c53938ff9"} Oct 03 14:57:17 crc kubenswrapper[4774]: I1003 14:57:17.277641 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mrxjm" event={"ID":"4da4144a-7e0c-4733-b462-1c4887e0f515","Type":"ContainerDied","Data":"4c394fef6df27925387d3f7d9180b844bf72b4084ae7668a129737076b4cb425"} Oct 03 14:57:17 crc kubenswrapper[4774]: I1003 14:57:17.277659 4774 scope.go:117] "RemoveContainer" containerID="331dae409616f776f8d1ff1e9aca73902cd6b52fe7cad8c16117c78c53938ff9" Oct 03 14:57:17 crc kubenswrapper[4774]: I1003 14:57:17.277783 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mrxjm" Oct 03 14:57:17 crc kubenswrapper[4774]: I1003 14:57:17.285474 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7c89b76849-9klgg" event={"ID":"4ad90ede-158d-4798-a2d5-399d61654604","Type":"ContainerStarted","Data":"75474fba9531caa5111f42f10a996160452617f6ceab476af8ef1b5fd3e61d62"} Oct 03 14:57:17 crc kubenswrapper[4774]: I1003 14:57:17.319992 4774 scope.go:117] "RemoveContainer" containerID="cdc154dcc10f7a632ea0fdd2b20514b961d38e778c3153509513dfb39465fb2d" Oct 03 14:57:17 crc kubenswrapper[4774]: I1003 14:57:17.359259 4774 scope.go:117] "RemoveContainer" containerID="9dad5eb30e3fff6d9ddcf1c47b4cc0018140684af175e3ced7296b8160d5447a" Oct 03 14:57:17 crc kubenswrapper[4774]: I1003 14:57:17.365856 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-7c89b76849-9klgg" podStartSLOduration=2.636959698 podStartE2EDuration="9.365835668s" podCreationTimestamp="2025-10-03 14:57:08 +0000 UTC" firstStartedPulling="2025-10-03 14:57:09.444637285 +0000 UTC m=+852.033840737" lastFinishedPulling="2025-10-03 14:57:16.173513235 +0000 UTC m=+858.762716707" observedRunningTime="2025-10-03 14:57:17.359234544 +0000 UTC m=+859.948438046" watchObservedRunningTime="2025-10-03 14:57:17.365835668 +0000 UTC m=+859.955039120" Oct 03 14:57:17 crc kubenswrapper[4774]: I1003 14:57:17.372760 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9hqmr" Oct 03 14:57:17 crc kubenswrapper[4774]: I1003 14:57:17.383460 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mrxjm"] Oct 03 14:57:17 crc kubenswrapper[4774]: I1003 14:57:17.386570 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mrxjm"] Oct 03 14:57:17 
crc kubenswrapper[4774]: I1003 14:57:17.387350 4774 scope.go:117] "RemoveContainer" containerID="331dae409616f776f8d1ff1e9aca73902cd6b52fe7cad8c16117c78c53938ff9" Oct 03 14:57:17 crc kubenswrapper[4774]: E1003 14:57:17.387884 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"331dae409616f776f8d1ff1e9aca73902cd6b52fe7cad8c16117c78c53938ff9\": container with ID starting with 331dae409616f776f8d1ff1e9aca73902cd6b52fe7cad8c16117c78c53938ff9 not found: ID does not exist" containerID="331dae409616f776f8d1ff1e9aca73902cd6b52fe7cad8c16117c78c53938ff9" Oct 03 14:57:17 crc kubenswrapper[4774]: I1003 14:57:17.387912 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"331dae409616f776f8d1ff1e9aca73902cd6b52fe7cad8c16117c78c53938ff9"} err="failed to get container status \"331dae409616f776f8d1ff1e9aca73902cd6b52fe7cad8c16117c78c53938ff9\": rpc error: code = NotFound desc = could not find container \"331dae409616f776f8d1ff1e9aca73902cd6b52fe7cad8c16117c78c53938ff9\": container with ID starting with 331dae409616f776f8d1ff1e9aca73902cd6b52fe7cad8c16117c78c53938ff9 not found: ID does not exist" Oct 03 14:57:17 crc kubenswrapper[4774]: I1003 14:57:17.387934 4774 scope.go:117] "RemoveContainer" containerID="cdc154dcc10f7a632ea0fdd2b20514b961d38e778c3153509513dfb39465fb2d" Oct 03 14:57:17 crc kubenswrapper[4774]: E1003 14:57:17.388286 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdc154dcc10f7a632ea0fdd2b20514b961d38e778c3153509513dfb39465fb2d\": container with ID starting with cdc154dcc10f7a632ea0fdd2b20514b961d38e778c3153509513dfb39465fb2d not found: ID does not exist" containerID="cdc154dcc10f7a632ea0fdd2b20514b961d38e778c3153509513dfb39465fb2d" Oct 03 14:57:17 crc kubenswrapper[4774]: I1003 14:57:17.388354 4774 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cdc154dcc10f7a632ea0fdd2b20514b961d38e778c3153509513dfb39465fb2d"} err="failed to get container status \"cdc154dcc10f7a632ea0fdd2b20514b961d38e778c3153509513dfb39465fb2d\": rpc error: code = NotFound desc = could not find container \"cdc154dcc10f7a632ea0fdd2b20514b961d38e778c3153509513dfb39465fb2d\": container with ID starting with cdc154dcc10f7a632ea0fdd2b20514b961d38e778c3153509513dfb39465fb2d not found: ID does not exist" Oct 03 14:57:17 crc kubenswrapper[4774]: I1003 14:57:17.388424 4774 scope.go:117] "RemoveContainer" containerID="9dad5eb30e3fff6d9ddcf1c47b4cc0018140684af175e3ced7296b8160d5447a" Oct 03 14:57:17 crc kubenswrapper[4774]: E1003 14:57:17.388764 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dad5eb30e3fff6d9ddcf1c47b4cc0018140684af175e3ced7296b8160d5447a\": container with ID starting with 9dad5eb30e3fff6d9ddcf1c47b4cc0018140684af175e3ced7296b8160d5447a not found: ID does not exist" containerID="9dad5eb30e3fff6d9ddcf1c47b4cc0018140684af175e3ced7296b8160d5447a" Oct 03 14:57:17 crc kubenswrapper[4774]: I1003 14:57:17.388788 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dad5eb30e3fff6d9ddcf1c47b4cc0018140684af175e3ced7296b8160d5447a"} err="failed to get container status \"9dad5eb30e3fff6d9ddcf1c47b4cc0018140684af175e3ced7296b8160d5447a\": rpc error: code = NotFound desc = could not find container \"9dad5eb30e3fff6d9ddcf1c47b4cc0018140684af175e3ced7296b8160d5447a\": container with ID starting with 9dad5eb30e3fff6d9ddcf1c47b4cc0018140684af175e3ced7296b8160d5447a not found: ID does not exist" Oct 03 14:57:18 crc kubenswrapper[4774]: I1003 14:57:18.296035 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjrm4" 
event={"ID":"97c881ca-6fb5-4f44-bf10-927f21fec469","Type":"ContainerStarted","Data":"959488775ff4b99265aae411015239d73d9f69af761ea214966e461eae75abf4"} Oct 03 14:57:18 crc kubenswrapper[4774]: I1003 14:57:18.296833 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-7c89b76849-9klgg" Oct 03 14:57:18 crc kubenswrapper[4774]: I1003 14:57:18.300002 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-7c89b76849-9klgg" Oct 03 14:57:18 crc kubenswrapper[4774]: I1003 14:57:18.321647 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qjrm4" podStartSLOduration=2.858587406 podStartE2EDuration="6.321631848s" podCreationTimestamp="2025-10-03 14:57:12 +0000 UTC" firstStartedPulling="2025-10-03 14:57:14.231508285 +0000 UTC m=+856.820711737" lastFinishedPulling="2025-10-03 14:57:17.694552727 +0000 UTC m=+860.283756179" observedRunningTime="2025-10-03 14:57:18.318765647 +0000 UTC m=+860.907969109" watchObservedRunningTime="2025-10-03 14:57:18.321631848 +0000 UTC m=+860.910835300" Oct 03 14:57:19 crc kubenswrapper[4774]: I1003 14:57:19.308219 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4da4144a-7e0c-4733-b462-1c4887e0f515" path="/var/lib/kubelet/pods/4da4144a-7e0c-4733-b462-1c4887e0f515/volumes" Oct 03 14:57:19 crc kubenswrapper[4774]: I1003 14:57:19.593466 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ndcm4"] Oct 03 14:57:19 crc kubenswrapper[4774]: E1003 14:57:19.594143 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4da4144a-7e0c-4733-b462-1c4887e0f515" containerName="extract-content" Oct 03 14:57:19 crc kubenswrapper[4774]: I1003 14:57:19.594187 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="4da4144a-7e0c-4733-b462-1c4887e0f515" 
containerName="extract-content" Oct 03 14:57:19 crc kubenswrapper[4774]: E1003 14:57:19.594226 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4da4144a-7e0c-4733-b462-1c4887e0f515" containerName="extract-utilities" Oct 03 14:57:19 crc kubenswrapper[4774]: I1003 14:57:19.594243 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="4da4144a-7e0c-4733-b462-1c4887e0f515" containerName="extract-utilities" Oct 03 14:57:19 crc kubenswrapper[4774]: E1003 14:57:19.594265 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4da4144a-7e0c-4733-b462-1c4887e0f515" containerName="registry-server" Oct 03 14:57:19 crc kubenswrapper[4774]: I1003 14:57:19.594281 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="4da4144a-7e0c-4733-b462-1c4887e0f515" containerName="registry-server" Oct 03 14:57:19 crc kubenswrapper[4774]: I1003 14:57:19.595251 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="4da4144a-7e0c-4733-b462-1c4887e0f515" containerName="registry-server" Oct 03 14:57:19 crc kubenswrapper[4774]: I1003 14:57:19.597713 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ndcm4" Oct 03 14:57:19 crc kubenswrapper[4774]: I1003 14:57:19.612701 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ndcm4"] Oct 03 14:57:19 crc kubenswrapper[4774]: I1003 14:57:19.646521 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a49f20fd-6fea-477b-afc6-baf8b0393115-catalog-content\") pod \"redhat-operators-ndcm4\" (UID: \"a49f20fd-6fea-477b-afc6-baf8b0393115\") " pod="openshift-marketplace/redhat-operators-ndcm4" Oct 03 14:57:19 crc kubenswrapper[4774]: I1003 14:57:19.646571 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a49f20fd-6fea-477b-afc6-baf8b0393115-utilities\") pod \"redhat-operators-ndcm4\" (UID: \"a49f20fd-6fea-477b-afc6-baf8b0393115\") " pod="openshift-marketplace/redhat-operators-ndcm4" Oct 03 14:57:19 crc kubenswrapper[4774]: I1003 14:57:19.646608 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjqvg\" (UniqueName: \"kubernetes.io/projected/a49f20fd-6fea-477b-afc6-baf8b0393115-kube-api-access-cjqvg\") pod \"redhat-operators-ndcm4\" (UID: \"a49f20fd-6fea-477b-afc6-baf8b0393115\") " pod="openshift-marketplace/redhat-operators-ndcm4" Oct 03 14:57:19 crc kubenswrapper[4774]: I1003 14:57:19.747865 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a49f20fd-6fea-477b-afc6-baf8b0393115-catalog-content\") pod \"redhat-operators-ndcm4\" (UID: \"a49f20fd-6fea-477b-afc6-baf8b0393115\") " pod="openshift-marketplace/redhat-operators-ndcm4" Oct 03 14:57:19 crc kubenswrapper[4774]: I1003 14:57:19.747916 4774 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a49f20fd-6fea-477b-afc6-baf8b0393115-utilities\") pod \"redhat-operators-ndcm4\" (UID: \"a49f20fd-6fea-477b-afc6-baf8b0393115\") " pod="openshift-marketplace/redhat-operators-ndcm4" Oct 03 14:57:19 crc kubenswrapper[4774]: I1003 14:57:19.747936 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjqvg\" (UniqueName: \"kubernetes.io/projected/a49f20fd-6fea-477b-afc6-baf8b0393115-kube-api-access-cjqvg\") pod \"redhat-operators-ndcm4\" (UID: \"a49f20fd-6fea-477b-afc6-baf8b0393115\") " pod="openshift-marketplace/redhat-operators-ndcm4" Oct 03 14:57:19 crc kubenswrapper[4774]: I1003 14:57:19.748412 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a49f20fd-6fea-477b-afc6-baf8b0393115-catalog-content\") pod \"redhat-operators-ndcm4\" (UID: \"a49f20fd-6fea-477b-afc6-baf8b0393115\") " pod="openshift-marketplace/redhat-operators-ndcm4" Oct 03 14:57:19 crc kubenswrapper[4774]: I1003 14:57:19.748486 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a49f20fd-6fea-477b-afc6-baf8b0393115-utilities\") pod \"redhat-operators-ndcm4\" (UID: \"a49f20fd-6fea-477b-afc6-baf8b0393115\") " pod="openshift-marketplace/redhat-operators-ndcm4" Oct 03 14:57:19 crc kubenswrapper[4774]: I1003 14:57:19.774429 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjqvg\" (UniqueName: \"kubernetes.io/projected/a49f20fd-6fea-477b-afc6-baf8b0393115-kube-api-access-cjqvg\") pod \"redhat-operators-ndcm4\" (UID: \"a49f20fd-6fea-477b-afc6-baf8b0393115\") " pod="openshift-marketplace/redhat-operators-ndcm4" Oct 03 14:57:19 crc kubenswrapper[4774]: I1003 14:57:19.914857 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ndcm4" Oct 03 14:57:20 crc kubenswrapper[4774]: I1003 14:57:20.370916 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ndcm4"] Oct 03 14:57:20 crc kubenswrapper[4774]: W1003 14:57:20.375911 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda49f20fd_6fea_477b_afc6_baf8b0393115.slice/crio-5bba7072f13bab9fa60b5bece6473e6ce89688ea33a3ac4b6b1c4ef626cedc9a WatchSource:0}: Error finding container 5bba7072f13bab9fa60b5bece6473e6ce89688ea33a3ac4b6b1c4ef626cedc9a: Status 404 returned error can't find the container with id 5bba7072f13bab9fa60b5bece6473e6ce89688ea33a3ac4b6b1c4ef626cedc9a Oct 03 14:57:21 crc kubenswrapper[4774]: I1003 14:57:21.350303 4774 generic.go:334] "Generic (PLEG): container finished" podID="a49f20fd-6fea-477b-afc6-baf8b0393115" containerID="afaf9cb06cf2035219781884d833a533977afad847cc08594047f04ff42325cd" exitCode=0 Oct 03 14:57:21 crc kubenswrapper[4774]: I1003 14:57:21.350464 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ndcm4" event={"ID":"a49f20fd-6fea-477b-afc6-baf8b0393115","Type":"ContainerDied","Data":"afaf9cb06cf2035219781884d833a533977afad847cc08594047f04ff42325cd"} Oct 03 14:57:21 crc kubenswrapper[4774]: I1003 14:57:21.350824 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ndcm4" event={"ID":"a49f20fd-6fea-477b-afc6-baf8b0393115","Type":"ContainerStarted","Data":"5bba7072f13bab9fa60b5bece6473e6ce89688ea33a3ac4b6b1c4ef626cedc9a"} Oct 03 14:57:22 crc kubenswrapper[4774]: I1003 14:57:22.921273 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qjrm4" Oct 03 14:57:22 crc kubenswrapper[4774]: I1003 14:57:22.921594 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-qjrm4" Oct 03 14:57:22 crc kubenswrapper[4774]: I1003 14:57:22.971841 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qjrm4" Oct 03 14:57:23 crc kubenswrapper[4774]: I1003 14:57:23.184911 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9hqmr"] Oct 03 14:57:23 crc kubenswrapper[4774]: I1003 14:57:23.185250 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9hqmr" podUID="4cb629a3-abef-436c-a44d-7b51d921d68e" containerName="registry-server" containerID="cri-o://b79b77d5d76d307ace60d3e1a3d57564bc3733c5b88aa4997d99e50d14e20ea0" gracePeriod=2 Oct 03 14:57:23 crc kubenswrapper[4774]: I1003 14:57:23.368156 4774 generic.go:334] "Generic (PLEG): container finished" podID="4cb629a3-abef-436c-a44d-7b51d921d68e" containerID="b79b77d5d76d307ace60d3e1a3d57564bc3733c5b88aa4997d99e50d14e20ea0" exitCode=0 Oct 03 14:57:23 crc kubenswrapper[4774]: I1003 14:57:23.368240 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9hqmr" event={"ID":"4cb629a3-abef-436c-a44d-7b51d921d68e","Type":"ContainerDied","Data":"b79b77d5d76d307ace60d3e1a3d57564bc3733c5b88aa4997d99e50d14e20ea0"} Oct 03 14:57:23 crc kubenswrapper[4774]: I1003 14:57:23.371272 4774 generic.go:334] "Generic (PLEG): container finished" podID="a49f20fd-6fea-477b-afc6-baf8b0393115" containerID="9dd08f6ee9af3e6f4e70bfe2d14fc5a80b3343a4315c4399c995d892fd15e5fe" exitCode=0 Oct 03 14:57:23 crc kubenswrapper[4774]: I1003 14:57:23.371404 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ndcm4" event={"ID":"a49f20fd-6fea-477b-afc6-baf8b0393115","Type":"ContainerDied","Data":"9dd08f6ee9af3e6f4e70bfe2d14fc5a80b3343a4315c4399c995d892fd15e5fe"} Oct 03 14:57:23 crc kubenswrapper[4774]: 
I1003 14:57:23.425398 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qjrm4" Oct 03 14:57:23 crc kubenswrapper[4774]: I1003 14:57:23.655305 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9hqmr" Oct 03 14:57:23 crc kubenswrapper[4774]: I1003 14:57:23.702229 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47kp2\" (UniqueName: \"kubernetes.io/projected/4cb629a3-abef-436c-a44d-7b51d921d68e-kube-api-access-47kp2\") pod \"4cb629a3-abef-436c-a44d-7b51d921d68e\" (UID: \"4cb629a3-abef-436c-a44d-7b51d921d68e\") " Oct 03 14:57:23 crc kubenswrapper[4774]: I1003 14:57:23.702318 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cb629a3-abef-436c-a44d-7b51d921d68e-catalog-content\") pod \"4cb629a3-abef-436c-a44d-7b51d921d68e\" (UID: \"4cb629a3-abef-436c-a44d-7b51d921d68e\") " Oct 03 14:57:23 crc kubenswrapper[4774]: I1003 14:57:23.702406 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cb629a3-abef-436c-a44d-7b51d921d68e-utilities\") pod \"4cb629a3-abef-436c-a44d-7b51d921d68e\" (UID: \"4cb629a3-abef-436c-a44d-7b51d921d68e\") " Oct 03 14:57:23 crc kubenswrapper[4774]: I1003 14:57:23.703187 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cb629a3-abef-436c-a44d-7b51d921d68e-utilities" (OuterVolumeSpecName: "utilities") pod "4cb629a3-abef-436c-a44d-7b51d921d68e" (UID: "4cb629a3-abef-436c-a44d-7b51d921d68e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:57:23 crc kubenswrapper[4774]: I1003 14:57:23.707152 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cb629a3-abef-436c-a44d-7b51d921d68e-kube-api-access-47kp2" (OuterVolumeSpecName: "kube-api-access-47kp2") pod "4cb629a3-abef-436c-a44d-7b51d921d68e" (UID: "4cb629a3-abef-436c-a44d-7b51d921d68e"). InnerVolumeSpecName "kube-api-access-47kp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:57:23 crc kubenswrapper[4774]: I1003 14:57:23.714121 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cb629a3-abef-436c-a44d-7b51d921d68e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4cb629a3-abef-436c-a44d-7b51d921d68e" (UID: "4cb629a3-abef-436c-a44d-7b51d921d68e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:57:23 crc kubenswrapper[4774]: I1003 14:57:23.804109 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47kp2\" (UniqueName: \"kubernetes.io/projected/4cb629a3-abef-436c-a44d-7b51d921d68e-kube-api-access-47kp2\") on node \"crc\" DevicePath \"\"" Oct 03 14:57:23 crc kubenswrapper[4774]: I1003 14:57:23.804148 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cb629a3-abef-436c-a44d-7b51d921d68e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:57:23 crc kubenswrapper[4774]: I1003 14:57:23.804161 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cb629a3-abef-436c-a44d-7b51d921d68e-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:57:24 crc kubenswrapper[4774]: I1003 14:57:24.379666 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9hqmr" 
event={"ID":"4cb629a3-abef-436c-a44d-7b51d921d68e","Type":"ContainerDied","Data":"c7687d6338932133e307ff8244ad69801fd9aff226b0b1db845638291ddfd093"} Oct 03 14:57:24 crc kubenswrapper[4774]: I1003 14:57:24.379734 4774 scope.go:117] "RemoveContainer" containerID="b79b77d5d76d307ace60d3e1a3d57564bc3733c5b88aa4997d99e50d14e20ea0" Oct 03 14:57:24 crc kubenswrapper[4774]: I1003 14:57:24.379737 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9hqmr" Oct 03 14:57:24 crc kubenswrapper[4774]: I1003 14:57:24.382254 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ndcm4" event={"ID":"a49f20fd-6fea-477b-afc6-baf8b0393115","Type":"ContainerStarted","Data":"db544a9d6175800b520f7f34d220b104b4e146b285ed5d88b7abcbe37ef4484e"} Oct 03 14:57:24 crc kubenswrapper[4774]: I1003 14:57:24.408629 4774 scope.go:117] "RemoveContainer" containerID="62d696bd0a919bba73843a87fbb33cdff2b10e172ab60e95d23ed8d577980d76" Oct 03 14:57:24 crc kubenswrapper[4774]: I1003 14:57:24.419693 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ndcm4" podStartSLOduration=3.01509342 podStartE2EDuration="5.419668994s" podCreationTimestamp="2025-10-03 14:57:19 +0000 UTC" firstStartedPulling="2025-10-03 14:57:21.352655987 +0000 UTC m=+863.941859479" lastFinishedPulling="2025-10-03 14:57:23.757231591 +0000 UTC m=+866.346435053" observedRunningTime="2025-10-03 14:57:24.400873975 +0000 UTC m=+866.990077487" watchObservedRunningTime="2025-10-03 14:57:24.419668994 +0000 UTC m=+867.008872436" Oct 03 14:57:24 crc kubenswrapper[4774]: I1003 14:57:24.421303 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9hqmr"] Oct 03 14:57:24 crc kubenswrapper[4774]: I1003 14:57:24.426537 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9hqmr"] 
Oct 03 14:57:24 crc kubenswrapper[4774]: I1003 14:57:24.453865 4774 scope.go:117] "RemoveContainer" containerID="caa9a7ea4820f7ceac3863a7cbdc2b667814de90d082911b621928fa68f765cb" Oct 03 14:57:24 crc kubenswrapper[4774]: I1003 14:57:24.982162 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qjrm4"] Oct 03 14:57:25 crc kubenswrapper[4774]: I1003 14:57:25.322223 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cb629a3-abef-436c-a44d-7b51d921d68e" path="/var/lib/kubelet/pods/4cb629a3-abef-436c-a44d-7b51d921d68e/volumes" Oct 03 14:57:25 crc kubenswrapper[4774]: I1003 14:57:25.391442 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qjrm4" podUID="97c881ca-6fb5-4f44-bf10-927f21fec469" containerName="registry-server" containerID="cri-o://959488775ff4b99265aae411015239d73d9f69af761ea214966e461eae75abf4" gracePeriod=2 Oct 03 14:57:25 crc kubenswrapper[4774]: I1003 14:57:25.802845 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qjrm4" Oct 03 14:57:25 crc kubenswrapper[4774]: I1003 14:57:25.844130 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97c881ca-6fb5-4f44-bf10-927f21fec469-utilities\") pod \"97c881ca-6fb5-4f44-bf10-927f21fec469\" (UID: \"97c881ca-6fb5-4f44-bf10-927f21fec469\") " Oct 03 14:57:25 crc kubenswrapper[4774]: I1003 14:57:25.844224 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8x6d\" (UniqueName: \"kubernetes.io/projected/97c881ca-6fb5-4f44-bf10-927f21fec469-kube-api-access-c8x6d\") pod \"97c881ca-6fb5-4f44-bf10-927f21fec469\" (UID: \"97c881ca-6fb5-4f44-bf10-927f21fec469\") " Oct 03 14:57:25 crc kubenswrapper[4774]: I1003 14:57:25.844444 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97c881ca-6fb5-4f44-bf10-927f21fec469-catalog-content\") pod \"97c881ca-6fb5-4f44-bf10-927f21fec469\" (UID: \"97c881ca-6fb5-4f44-bf10-927f21fec469\") " Oct 03 14:57:25 crc kubenswrapper[4774]: I1003 14:57:25.845615 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97c881ca-6fb5-4f44-bf10-927f21fec469-utilities" (OuterVolumeSpecName: "utilities") pod "97c881ca-6fb5-4f44-bf10-927f21fec469" (UID: "97c881ca-6fb5-4f44-bf10-927f21fec469"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:57:25 crc kubenswrapper[4774]: I1003 14:57:25.850151 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97c881ca-6fb5-4f44-bf10-927f21fec469-kube-api-access-c8x6d" (OuterVolumeSpecName: "kube-api-access-c8x6d") pod "97c881ca-6fb5-4f44-bf10-927f21fec469" (UID: "97c881ca-6fb5-4f44-bf10-927f21fec469"). InnerVolumeSpecName "kube-api-access-c8x6d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:57:25 crc kubenswrapper[4774]: I1003 14:57:25.928798 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97c881ca-6fb5-4f44-bf10-927f21fec469-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "97c881ca-6fb5-4f44-bf10-927f21fec469" (UID: "97c881ca-6fb5-4f44-bf10-927f21fec469"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:57:25 crc kubenswrapper[4774]: I1003 14:57:25.946202 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97c881ca-6fb5-4f44-bf10-927f21fec469-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:57:25 crc kubenswrapper[4774]: I1003 14:57:25.946236 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97c881ca-6fb5-4f44-bf10-927f21fec469-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:57:25 crc kubenswrapper[4774]: I1003 14:57:25.946245 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8x6d\" (UniqueName: \"kubernetes.io/projected/97c881ca-6fb5-4f44-bf10-927f21fec469-kube-api-access-c8x6d\") on node \"crc\" DevicePath \"\"" Oct 03 14:57:26 crc kubenswrapper[4774]: I1003 14:57:26.399890 4774 generic.go:334] "Generic (PLEG): container finished" podID="97c881ca-6fb5-4f44-bf10-927f21fec469" containerID="959488775ff4b99265aae411015239d73d9f69af761ea214966e461eae75abf4" exitCode=0 Oct 03 14:57:26 crc kubenswrapper[4774]: I1003 14:57:26.399942 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjrm4" event={"ID":"97c881ca-6fb5-4f44-bf10-927f21fec469","Type":"ContainerDied","Data":"959488775ff4b99265aae411015239d73d9f69af761ea214966e461eae75abf4"} Oct 03 14:57:26 crc kubenswrapper[4774]: I1003 14:57:26.399989 4774 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-qjrm4" event={"ID":"97c881ca-6fb5-4f44-bf10-927f21fec469","Type":"ContainerDied","Data":"7e912406d692867b3fa4be12601aae00bd841a2d5f5c82fc557a28fd90e3e8c3"} Oct 03 14:57:26 crc kubenswrapper[4774]: I1003 14:57:26.400016 4774 scope.go:117] "RemoveContainer" containerID="959488775ff4b99265aae411015239d73d9f69af761ea214966e461eae75abf4" Oct 03 14:57:26 crc kubenswrapper[4774]: I1003 14:57:26.400032 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qjrm4" Oct 03 14:57:26 crc kubenswrapper[4774]: I1003 14:57:26.415840 4774 scope.go:117] "RemoveContainer" containerID="5433900289c8a5c4069f2aa77b292558b1a64b0d3f7d834320cefd41c8facaba" Oct 03 14:57:26 crc kubenswrapper[4774]: I1003 14:57:26.430001 4774 scope.go:117] "RemoveContainer" containerID="2bfeb9dc80edf7a0a764d0f24e70b4fa83118ad0f5d82f609359f3e2d7516c96" Oct 03 14:57:26 crc kubenswrapper[4774]: I1003 14:57:26.450835 4774 scope.go:117] "RemoveContainer" containerID="959488775ff4b99265aae411015239d73d9f69af761ea214966e461eae75abf4" Oct 03 14:57:26 crc kubenswrapper[4774]: E1003 14:57:26.453966 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"959488775ff4b99265aae411015239d73d9f69af761ea214966e461eae75abf4\": container with ID starting with 959488775ff4b99265aae411015239d73d9f69af761ea214966e461eae75abf4 not found: ID does not exist" containerID="959488775ff4b99265aae411015239d73d9f69af761ea214966e461eae75abf4" Oct 03 14:57:26 crc kubenswrapper[4774]: I1003 14:57:26.454061 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"959488775ff4b99265aae411015239d73d9f69af761ea214966e461eae75abf4"} err="failed to get container status \"959488775ff4b99265aae411015239d73d9f69af761ea214966e461eae75abf4\": rpc error: code = NotFound desc = could not find container 
\"959488775ff4b99265aae411015239d73d9f69af761ea214966e461eae75abf4\": container with ID starting with 959488775ff4b99265aae411015239d73d9f69af761ea214966e461eae75abf4 not found: ID does not exist" Oct 03 14:57:26 crc kubenswrapper[4774]: I1003 14:57:26.454112 4774 scope.go:117] "RemoveContainer" containerID="5433900289c8a5c4069f2aa77b292558b1a64b0d3f7d834320cefd41c8facaba" Oct 03 14:57:26 crc kubenswrapper[4774]: E1003 14:57:26.454562 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5433900289c8a5c4069f2aa77b292558b1a64b0d3f7d834320cefd41c8facaba\": container with ID starting with 5433900289c8a5c4069f2aa77b292558b1a64b0d3f7d834320cefd41c8facaba not found: ID does not exist" containerID="5433900289c8a5c4069f2aa77b292558b1a64b0d3f7d834320cefd41c8facaba" Oct 03 14:57:26 crc kubenswrapper[4774]: I1003 14:57:26.454583 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5433900289c8a5c4069f2aa77b292558b1a64b0d3f7d834320cefd41c8facaba"} err="failed to get container status \"5433900289c8a5c4069f2aa77b292558b1a64b0d3f7d834320cefd41c8facaba\": rpc error: code = NotFound desc = could not find container \"5433900289c8a5c4069f2aa77b292558b1a64b0d3f7d834320cefd41c8facaba\": container with ID starting with 5433900289c8a5c4069f2aa77b292558b1a64b0d3f7d834320cefd41c8facaba not found: ID does not exist" Oct 03 14:57:26 crc kubenswrapper[4774]: I1003 14:57:26.454596 4774 scope.go:117] "RemoveContainer" containerID="2bfeb9dc80edf7a0a764d0f24e70b4fa83118ad0f5d82f609359f3e2d7516c96" Oct 03 14:57:26 crc kubenswrapper[4774]: E1003 14:57:26.454837 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bfeb9dc80edf7a0a764d0f24e70b4fa83118ad0f5d82f609359f3e2d7516c96\": container with ID starting with 2bfeb9dc80edf7a0a764d0f24e70b4fa83118ad0f5d82f609359f3e2d7516c96 not found: ID does not exist" 
containerID="2bfeb9dc80edf7a0a764d0f24e70b4fa83118ad0f5d82f609359f3e2d7516c96" Oct 03 14:57:26 crc kubenswrapper[4774]: I1003 14:57:26.454854 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bfeb9dc80edf7a0a764d0f24e70b4fa83118ad0f5d82f609359f3e2d7516c96"} err="failed to get container status \"2bfeb9dc80edf7a0a764d0f24e70b4fa83118ad0f5d82f609359f3e2d7516c96\": rpc error: code = NotFound desc = could not find container \"2bfeb9dc80edf7a0a764d0f24e70b4fa83118ad0f5d82f609359f3e2d7516c96\": container with ID starting with 2bfeb9dc80edf7a0a764d0f24e70b4fa83118ad0f5d82f609359f3e2d7516c96 not found: ID does not exist" Oct 03 14:57:26 crc kubenswrapper[4774]: I1003 14:57:26.457554 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qjrm4"] Oct 03 14:57:26 crc kubenswrapper[4774]: I1003 14:57:26.461937 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qjrm4"] Oct 03 14:57:27 crc kubenswrapper[4774]: I1003 14:57:27.309538 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97c881ca-6fb5-4f44-bf10-927f21fec469" path="/var/lib/kubelet/pods/97c881ca-6fb5-4f44-bf10-927f21fec469/volumes" Oct 03 14:57:29 crc kubenswrapper[4774]: I1003 14:57:29.914923 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ndcm4" Oct 03 14:57:29 crc kubenswrapper[4774]: I1003 14:57:29.915217 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ndcm4" Oct 03 14:57:29 crc kubenswrapper[4774]: I1003 14:57:29.981394 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ndcm4" Oct 03 14:57:30 crc kubenswrapper[4774]: I1003 14:57:30.498262 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-ndcm4" Oct 03 14:57:31 crc kubenswrapper[4774]: I1003 14:57:31.587193 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ndcm4"] Oct 03 14:57:32 crc kubenswrapper[4774]: I1003 14:57:32.439894 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ndcm4" podUID="a49f20fd-6fea-477b-afc6-baf8b0393115" containerName="registry-server" containerID="cri-o://db544a9d6175800b520f7f34d220b104b4e146b285ed5d88b7abcbe37ef4484e" gracePeriod=2 Oct 03 14:57:33 crc kubenswrapper[4774]: I1003 14:57:33.450417 4774 generic.go:334] "Generic (PLEG): container finished" podID="a49f20fd-6fea-477b-afc6-baf8b0393115" containerID="db544a9d6175800b520f7f34d220b104b4e146b285ed5d88b7abcbe37ef4484e" exitCode=0 Oct 03 14:57:33 crc kubenswrapper[4774]: I1003 14:57:33.450501 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ndcm4" event={"ID":"a49f20fd-6fea-477b-afc6-baf8b0393115","Type":"ContainerDied","Data":"db544a9d6175800b520f7f34d220b104b4e146b285ed5d88b7abcbe37ef4484e"} Oct 03 14:57:33 crc kubenswrapper[4774]: I1003 14:57:33.717788 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ndcm4" Oct 03 14:57:33 crc kubenswrapper[4774]: I1003 14:57:33.745872 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjqvg\" (UniqueName: \"kubernetes.io/projected/a49f20fd-6fea-477b-afc6-baf8b0393115-kube-api-access-cjqvg\") pod \"a49f20fd-6fea-477b-afc6-baf8b0393115\" (UID: \"a49f20fd-6fea-477b-afc6-baf8b0393115\") " Oct 03 14:57:33 crc kubenswrapper[4774]: I1003 14:57:33.745912 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a49f20fd-6fea-477b-afc6-baf8b0393115-catalog-content\") pod \"a49f20fd-6fea-477b-afc6-baf8b0393115\" (UID: \"a49f20fd-6fea-477b-afc6-baf8b0393115\") " Oct 03 14:57:33 crc kubenswrapper[4774]: I1003 14:57:33.745969 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a49f20fd-6fea-477b-afc6-baf8b0393115-utilities\") pod \"a49f20fd-6fea-477b-afc6-baf8b0393115\" (UID: \"a49f20fd-6fea-477b-afc6-baf8b0393115\") " Oct 03 14:57:33 crc kubenswrapper[4774]: I1003 14:57:33.746977 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a49f20fd-6fea-477b-afc6-baf8b0393115-utilities" (OuterVolumeSpecName: "utilities") pod "a49f20fd-6fea-477b-afc6-baf8b0393115" (UID: "a49f20fd-6fea-477b-afc6-baf8b0393115"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:57:33 crc kubenswrapper[4774]: I1003 14:57:33.751238 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a49f20fd-6fea-477b-afc6-baf8b0393115-kube-api-access-cjqvg" (OuterVolumeSpecName: "kube-api-access-cjqvg") pod "a49f20fd-6fea-477b-afc6-baf8b0393115" (UID: "a49f20fd-6fea-477b-afc6-baf8b0393115"). InnerVolumeSpecName "kube-api-access-cjqvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:57:33 crc kubenswrapper[4774]: I1003 14:57:33.847310 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a49f20fd-6fea-477b-afc6-baf8b0393115-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:57:33 crc kubenswrapper[4774]: I1003 14:57:33.847619 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjqvg\" (UniqueName: \"kubernetes.io/projected/a49f20fd-6fea-477b-afc6-baf8b0393115-kube-api-access-cjqvg\") on node \"crc\" DevicePath \"\"" Oct 03 14:57:34 crc kubenswrapper[4774]: I1003 14:57:34.458255 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ndcm4" event={"ID":"a49f20fd-6fea-477b-afc6-baf8b0393115","Type":"ContainerDied","Data":"5bba7072f13bab9fa60b5bece6473e6ce89688ea33a3ac4b6b1c4ef626cedc9a"} Oct 03 14:57:34 crc kubenswrapper[4774]: I1003 14:57:34.458314 4774 scope.go:117] "RemoveContainer" containerID="db544a9d6175800b520f7f34d220b104b4e146b285ed5d88b7abcbe37ef4484e" Oct 03 14:57:34 crc kubenswrapper[4774]: I1003 14:57:34.458316 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ndcm4" Oct 03 14:57:34 crc kubenswrapper[4774]: I1003 14:57:34.485705 4774 scope.go:117] "RemoveContainer" containerID="9dd08f6ee9af3e6f4e70bfe2d14fc5a80b3343a4315c4399c995d892fd15e5fe" Oct 03 14:57:34 crc kubenswrapper[4774]: I1003 14:57:34.506148 4774 scope.go:117] "RemoveContainer" containerID="afaf9cb06cf2035219781884d833a533977afad847cc08594047f04ff42325cd" Oct 03 14:57:35 crc kubenswrapper[4774]: I1003 14:57:35.045080 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a49f20fd-6fea-477b-afc6-baf8b0393115-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a49f20fd-6fea-477b-afc6-baf8b0393115" (UID: "a49f20fd-6fea-477b-afc6-baf8b0393115"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:57:35 crc kubenswrapper[4774]: I1003 14:57:35.068265 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a49f20fd-6fea-477b-afc6-baf8b0393115-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:57:35 crc kubenswrapper[4774]: I1003 14:57:35.094747 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ndcm4"] Oct 03 14:57:35 crc kubenswrapper[4774]: I1003 14:57:35.101636 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ndcm4"] Oct 03 14:57:35 crc kubenswrapper[4774]: I1003 14:57:35.310823 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a49f20fd-6fea-477b-afc6-baf8b0393115" path="/var/lib/kubelet/pods/a49f20fd-6fea-477b-afc6-baf8b0393115/volumes" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.625428 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6c675fb79f-b54rk"] Oct 03 14:57:53 crc kubenswrapper[4774]: E1003 14:57:53.626246 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cb629a3-abef-436c-a44d-7b51d921d68e" containerName="registry-server" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.626264 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cb629a3-abef-436c-a44d-7b51d921d68e" containerName="registry-server" Oct 03 14:57:53 crc kubenswrapper[4774]: E1003 14:57:53.626286 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97c881ca-6fb5-4f44-bf10-927f21fec469" containerName="registry-server" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.626294 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="97c881ca-6fb5-4f44-bf10-927f21fec469" containerName="registry-server" Oct 03 14:57:53 crc kubenswrapper[4774]: E1003 
14:57:53.626305 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a49f20fd-6fea-477b-afc6-baf8b0393115" containerName="extract-utilities" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.626313 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="a49f20fd-6fea-477b-afc6-baf8b0393115" containerName="extract-utilities" Oct 03 14:57:53 crc kubenswrapper[4774]: E1003 14:57:53.626328 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97c881ca-6fb5-4f44-bf10-927f21fec469" containerName="extract-utilities" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.626335 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="97c881ca-6fb5-4f44-bf10-927f21fec469" containerName="extract-utilities" Oct 03 14:57:53 crc kubenswrapper[4774]: E1003 14:57:53.626349 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a49f20fd-6fea-477b-afc6-baf8b0393115" containerName="registry-server" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.626356 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="a49f20fd-6fea-477b-afc6-baf8b0393115" containerName="registry-server" Oct 03 14:57:53 crc kubenswrapper[4774]: E1003 14:57:53.626391 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cb629a3-abef-436c-a44d-7b51d921d68e" containerName="extract-utilities" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.626399 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cb629a3-abef-436c-a44d-7b51d921d68e" containerName="extract-utilities" Oct 03 14:57:53 crc kubenswrapper[4774]: E1003 14:57:53.626414 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97c881ca-6fb5-4f44-bf10-927f21fec469" containerName="extract-content" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.626424 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="97c881ca-6fb5-4f44-bf10-927f21fec469" containerName="extract-content" Oct 03 14:57:53 crc kubenswrapper[4774]: E1003 
14:57:53.626437 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cb629a3-abef-436c-a44d-7b51d921d68e" containerName="extract-content" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.626444 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cb629a3-abef-436c-a44d-7b51d921d68e" containerName="extract-content" Oct 03 14:57:53 crc kubenswrapper[4774]: E1003 14:57:53.626455 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a49f20fd-6fea-477b-afc6-baf8b0393115" containerName="extract-content" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.626463 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="a49f20fd-6fea-477b-afc6-baf8b0393115" containerName="extract-content" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.626591 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="a49f20fd-6fea-477b-afc6-baf8b0393115" containerName="registry-server" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.626607 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="97c881ca-6fb5-4f44-bf10-927f21fec469" containerName="registry-server" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.626624 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cb629a3-abef-436c-a44d-7b51d921d68e" containerName="registry-server" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.627416 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-b54rk" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.631049 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-k577z" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.638563 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6c675fb79f-b54rk"] Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.645540 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79d68d6c85-6gckj"] Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.646670 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-6gckj" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.652231 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-8rdq7" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.661434 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-m2wmk"] Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.662396 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-m2wmk" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.664808 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-h4tkc" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.676363 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-846dff85b5-v5kkx"] Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.678459 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-v5kkx" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.683736 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-vnlbj" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.693736 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-846dff85b5-v5kkx"] Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.697751 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-599898f689-mhz7w"] Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.698744 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-599898f689-mhz7w" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.700149 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-892pp" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.711955 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-449n6\" (UniqueName: \"kubernetes.io/projected/cea60414-e959-4200-b3e5-e532d2136047-kube-api-access-449n6\") pod \"barbican-operator-controller-manager-6c675fb79f-b54rk\" (UID: \"cea60414-e959-4200-b3e5-e532d2136047\") " pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-b54rk" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.712045 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw2j9\" (UniqueName: \"kubernetes.io/projected/4cf018fb-edab-4e23-ad04-763ee25e1613-kube-api-access-lw2j9\") pod \"designate-operator-controller-manager-75dfd9b554-m2wmk\" (UID: \"4cf018fb-edab-4e23-ad04-763ee25e1613\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-m2wmk" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.712080 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8mr9\" (UniqueName: \"kubernetes.io/projected/29905a36-139e-4611-bc8e-0289dd1fa0b4-kube-api-access-h8mr9\") pod \"glance-operator-controller-manager-846dff85b5-v5kkx\" (UID: \"29905a36-139e-4611-bc8e-0289dd1fa0b4\") " pod="openstack-operators/glance-operator-controller-manager-846dff85b5-v5kkx" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.712105 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdc9l\" (UniqueName: 
\"kubernetes.io/projected/4c3d1495-6568-44c2-9bd7-82256a4b5aab-kube-api-access-rdc9l\") pod \"cinder-operator-controller-manager-79d68d6c85-6gckj\" (UID: \"4c3d1495-6568-44c2-9bd7-82256a4b5aab\") " pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-6gckj" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.714705 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79d68d6c85-6gckj"] Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.725437 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6769b867d9-2tqv8"] Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.726436 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-2tqv8" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.729754 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-9kl44" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.741110 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-599898f689-mhz7w"] Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.747563 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-m2wmk"] Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.751276 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5fbf469cd7-zhmr8"] Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.752401 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-zhmr8" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.755906 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6769b867d9-2tqv8"] Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.760763 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-lbvt9" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.760892 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.774430 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5fbf469cd7-zhmr8"] Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.791207 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-84bc9db6cc-2sg6f"] Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.792099 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-2sg6f" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.804121 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-7v9wz" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.808457 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7f55849f88-fw4zb"] Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.809326 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-fw4zb" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.813148 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhd4d\" (UniqueName: \"kubernetes.io/projected/bc3a311f-a6c2-40e4-aaae-549aa2395c57-kube-api-access-fhd4d\") pod \"infra-operator-controller-manager-5fbf469cd7-zhmr8\" (UID: \"bc3a311f-a6c2-40e4-aaae-549aa2395c57\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-zhmr8" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.813234 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlbns\" (UniqueName: \"kubernetes.io/projected/02bdd1b6-4d8f-40ce-b0fe-449c738d5d0e-kube-api-access-nlbns\") pod \"horizon-operator-controller-manager-6769b867d9-2tqv8\" (UID: \"02bdd1b6-4d8f-40ce-b0fe-449c738d5d0e\") " pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-2tqv8" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.813272 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw2j9\" (UniqueName: \"kubernetes.io/projected/4cf018fb-edab-4e23-ad04-763ee25e1613-kube-api-access-lw2j9\") pod \"designate-operator-controller-manager-75dfd9b554-m2wmk\" (UID: \"4cf018fb-edab-4e23-ad04-763ee25e1613\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-m2wmk" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.813311 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc3a311f-a6c2-40e4-aaae-549aa2395c57-cert\") pod \"infra-operator-controller-manager-5fbf469cd7-zhmr8\" (UID: \"bc3a311f-a6c2-40e4-aaae-549aa2395c57\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-zhmr8" Oct 03 14:57:53 crc 
kubenswrapper[4774]: I1003 14:57:53.813341 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8mr9\" (UniqueName: \"kubernetes.io/projected/29905a36-139e-4611-bc8e-0289dd1fa0b4-kube-api-access-h8mr9\") pod \"glance-operator-controller-manager-846dff85b5-v5kkx\" (UID: \"29905a36-139e-4611-bc8e-0289dd1fa0b4\") " pod="openstack-operators/glance-operator-controller-manager-846dff85b5-v5kkx" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.813440 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdc9l\" (UniqueName: \"kubernetes.io/projected/4c3d1495-6568-44c2-9bd7-82256a4b5aab-kube-api-access-rdc9l\") pod \"cinder-operator-controller-manager-79d68d6c85-6gckj\" (UID: \"4c3d1495-6568-44c2-9bd7-82256a4b5aab\") " pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-6gckj" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.813465 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwr2m\" (UniqueName: \"kubernetes.io/projected/ecf9de8d-83d4-41cb-9b00-e3aeedfb93fd-kube-api-access-kwr2m\") pod \"heat-operator-controller-manager-599898f689-mhz7w\" (UID: \"ecf9de8d-83d4-41cb-9b00-e3aeedfb93fd\") " pod="openstack-operators/heat-operator-controller-manager-599898f689-mhz7w" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.813497 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-449n6\" (UniqueName: \"kubernetes.io/projected/cea60414-e959-4200-b3e5-e532d2136047-kube-api-access-449n6\") pod \"barbican-operator-controller-manager-6c675fb79f-b54rk\" (UID: \"cea60414-e959-4200-b3e5-e532d2136047\") " pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-b54rk" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.820636 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/ironic-operator-controller-manager-84bc9db6cc-2sg6f"] Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.824395 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-qc2rp" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.842659 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7f55849f88-fw4zb"] Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.852763 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6fd6854b49-24bvc"] Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.853778 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-24bvc" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.859614 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-6ng8c" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.869326 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdc9l\" (UniqueName: \"kubernetes.io/projected/4c3d1495-6568-44c2-9bd7-82256a4b5aab-kube-api-access-rdc9l\") pod \"cinder-operator-controller-manager-79d68d6c85-6gckj\" (UID: \"4c3d1495-6568-44c2-9bd7-82256a4b5aab\") " pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-6gckj" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.871940 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-449n6\" (UniqueName: \"kubernetes.io/projected/cea60414-e959-4200-b3e5-e532d2136047-kube-api-access-449n6\") pod \"barbican-operator-controller-manager-6c675fb79f-b54rk\" (UID: \"cea60414-e959-4200-b3e5-e532d2136047\") " 
pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-b54rk" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.874870 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw2j9\" (UniqueName: \"kubernetes.io/projected/4cf018fb-edab-4e23-ad04-763ee25e1613-kube-api-access-lw2j9\") pod \"designate-operator-controller-manager-75dfd9b554-m2wmk\" (UID: \"4cf018fb-edab-4e23-ad04-763ee25e1613\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-m2wmk" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.874927 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-mtjh7"] Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.875845 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-mtjh7" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.882746 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-s8vcg" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.889479 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6fd6854b49-24bvc"] Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.916019 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktjw4\" (UniqueName: \"kubernetes.io/projected/6edea7f2-581f-4f41-bdda-45e83dce680d-kube-api-access-ktjw4\") pod \"mariadb-operator-controller-manager-5c468bf4d4-mtjh7\" (UID: \"6edea7f2-581f-4f41-bdda-45e83dce680d\") " pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-mtjh7" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.916085 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-tn2hb\" (UniqueName: \"kubernetes.io/projected/741c17eb-da65-4dce-abc6-7faa47d28004-kube-api-access-tn2hb\") pod \"ironic-operator-controller-manager-84bc9db6cc-2sg6f\" (UID: \"741c17eb-da65-4dce-abc6-7faa47d28004\") " pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-2sg6f" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.916113 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d644k\" (UniqueName: \"kubernetes.io/projected/b32ca090-1129-4c77-a2b3-df9e51a35a48-kube-api-access-d644k\") pod \"keystone-operator-controller-manager-7f55849f88-fw4zb\" (UID: \"b32ca090-1129-4c77-a2b3-df9e51a35a48\") " pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-fw4zb" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.916133 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m24n7\" (UniqueName: \"kubernetes.io/projected/10674e8a-5afd-45f7-af36-e9dbfaf2dba0-kube-api-access-m24n7\") pod \"manila-operator-controller-manager-6fd6854b49-24bvc\" (UID: \"10674e8a-5afd-45f7-af36-e9dbfaf2dba0\") " pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-24bvc" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.916160 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhd4d\" (UniqueName: \"kubernetes.io/projected/bc3a311f-a6c2-40e4-aaae-549aa2395c57-kube-api-access-fhd4d\") pod \"infra-operator-controller-manager-5fbf469cd7-zhmr8\" (UID: \"bc3a311f-a6c2-40e4-aaae-549aa2395c57\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-zhmr8" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.916182 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlbns\" (UniqueName: 
\"kubernetes.io/projected/02bdd1b6-4d8f-40ce-b0fe-449c738d5d0e-kube-api-access-nlbns\") pod \"horizon-operator-controller-manager-6769b867d9-2tqv8\" (UID: \"02bdd1b6-4d8f-40ce-b0fe-449c738d5d0e\") " pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-2tqv8" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.916209 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc3a311f-a6c2-40e4-aaae-549aa2395c57-cert\") pod \"infra-operator-controller-manager-5fbf469cd7-zhmr8\" (UID: \"bc3a311f-a6c2-40e4-aaae-549aa2395c57\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-zhmr8" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.916245 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwr2m\" (UniqueName: \"kubernetes.io/projected/ecf9de8d-83d4-41cb-9b00-e3aeedfb93fd-kube-api-access-kwr2m\") pod \"heat-operator-controller-manager-599898f689-mhz7w\" (UID: \"ecf9de8d-83d4-41cb-9b00-e3aeedfb93fd\") " pod="openstack-operators/heat-operator-controller-manager-599898f689-mhz7w" Oct 03 14:57:53 crc kubenswrapper[4774]: E1003 14:57:53.916653 4774 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 03 14:57:53 crc kubenswrapper[4774]: E1003 14:57:53.917121 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc3a311f-a6c2-40e4-aaae-549aa2395c57-cert podName:bc3a311f-a6c2-40e4-aaae-549aa2395c57 nodeName:}" failed. No retries permitted until 2025-10-03 14:57:54.417101995 +0000 UTC m=+897.006305447 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bc3a311f-a6c2-40e4-aaae-549aa2395c57-cert") pod "infra-operator-controller-manager-5fbf469cd7-zhmr8" (UID: "bc3a311f-a6c2-40e4-aaae-549aa2395c57") : secret "infra-operator-webhook-server-cert" not found Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.919014 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-mtjh7"] Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.942685 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8mr9\" (UniqueName: \"kubernetes.io/projected/29905a36-139e-4611-bc8e-0289dd1fa0b4-kube-api-access-h8mr9\") pod \"glance-operator-controller-manager-846dff85b5-v5kkx\" (UID: \"29905a36-139e-4611-bc8e-0289dd1fa0b4\") " pod="openstack-operators/glance-operator-controller-manager-846dff85b5-v5kkx" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.952384 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-b54rk" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.963924 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhd4d\" (UniqueName: \"kubernetes.io/projected/bc3a311f-a6c2-40e4-aaae-549aa2395c57-kube-api-access-fhd4d\") pod \"infra-operator-controller-manager-5fbf469cd7-zhmr8\" (UID: \"bc3a311f-a6c2-40e4-aaae-549aa2395c57\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-zhmr8" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.969667 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-6gckj" Oct 03 14:57:53 crc kubenswrapper[4774]: I1003 14:57:53.991955 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-m2wmk" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.002918 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-v5kkx" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.006947 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-59d6cfdf45-84vwj"] Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.054614 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlbns\" (UniqueName: \"kubernetes.io/projected/02bdd1b6-4d8f-40ce-b0fe-449c738d5d0e-kube-api-access-nlbns\") pod \"horizon-operator-controller-manager-6769b867d9-2tqv8\" (UID: \"02bdd1b6-4d8f-40ce-b0fe-449c738d5d0e\") " pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-2tqv8" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.056315 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-84vwj" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.058926 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn2hb\" (UniqueName: \"kubernetes.io/projected/741c17eb-da65-4dce-abc6-7faa47d28004-kube-api-access-tn2hb\") pod \"ironic-operator-controller-manager-84bc9db6cc-2sg6f\" (UID: \"741c17eb-da65-4dce-abc6-7faa47d28004\") " pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-2sg6f" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.058972 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d644k\" (UniqueName: \"kubernetes.io/projected/b32ca090-1129-4c77-a2b3-df9e51a35a48-kube-api-access-d644k\") pod \"keystone-operator-controller-manager-7f55849f88-fw4zb\" (UID: \"b32ca090-1129-4c77-a2b3-df9e51a35a48\") " pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-fw4zb" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.058995 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m24n7\" (UniqueName: \"kubernetes.io/projected/10674e8a-5afd-45f7-af36-e9dbfaf2dba0-kube-api-access-m24n7\") pod \"manila-operator-controller-manager-6fd6854b49-24bvc\" (UID: \"10674e8a-5afd-45f7-af36-e9dbfaf2dba0\") " pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-24bvc" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.059077 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktjw4\" (UniqueName: \"kubernetes.io/projected/6edea7f2-581f-4f41-bdda-45e83dce680d-kube-api-access-ktjw4\") pod \"mariadb-operator-controller-manager-5c468bf4d4-mtjh7\" (UID: \"6edea7f2-581f-4f41-bdda-45e83dce680d\") " pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-mtjh7" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 
14:57:54.065552 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-tmsqt" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.097949 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwr2m\" (UniqueName: \"kubernetes.io/projected/ecf9de8d-83d4-41cb-9b00-e3aeedfb93fd-kube-api-access-kwr2m\") pod \"heat-operator-controller-manager-599898f689-mhz7w\" (UID: \"ecf9de8d-83d4-41cb-9b00-e3aeedfb93fd\") " pod="openstack-operators/heat-operator-controller-manager-599898f689-mhz7w" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.104078 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktjw4\" (UniqueName: \"kubernetes.io/projected/6edea7f2-581f-4f41-bdda-45e83dce680d-kube-api-access-ktjw4\") pod \"mariadb-operator-controller-manager-5c468bf4d4-mtjh7\" (UID: \"6edea7f2-581f-4f41-bdda-45e83dce680d\") " pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-mtjh7" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.113239 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m24n7\" (UniqueName: \"kubernetes.io/projected/10674e8a-5afd-45f7-af36-e9dbfaf2dba0-kube-api-access-m24n7\") pod \"manila-operator-controller-manager-6fd6854b49-24bvc\" (UID: \"10674e8a-5afd-45f7-af36-e9dbfaf2dba0\") " pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-24bvc" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.116157 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn2hb\" (UniqueName: \"kubernetes.io/projected/741c17eb-da65-4dce-abc6-7faa47d28004-kube-api-access-tn2hb\") pod \"ironic-operator-controller-manager-84bc9db6cc-2sg6f\" (UID: \"741c17eb-da65-4dce-abc6-7faa47d28004\") " pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-2sg6f" Oct 03 14:57:54 crc 
kubenswrapper[4774]: I1003 14:57:54.117477 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6574bf987d-nw7mh"] Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.118793 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-nw7mh" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.126736 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-zkh58" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.128920 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-24bvc" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.128972 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-2sg6f" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.132508 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d644k\" (UniqueName: \"kubernetes.io/projected/b32ca090-1129-4c77-a2b3-df9e51a35a48-kube-api-access-d644k\") pod \"keystone-operator-controller-manager-7f55849f88-fw4zb\" (UID: \"b32ca090-1129-4c77-a2b3-df9e51a35a48\") " pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-fw4zb" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.134661 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-555c7456bd-fhd72"] Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.135295 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-mtjh7" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.135753 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-fhd72" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.142023 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-8gr6r" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.154103 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-59d6cfdf45-84vwj"] Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.160147 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j4z6\" (UniqueName: \"kubernetes.io/projected/5c00d52a-acc5-4650-8b36-48faa90030a3-kube-api-access-5j4z6\") pod \"octavia-operator-controller-manager-59d6cfdf45-84vwj\" (UID: \"5c00d52a-acc5-4650-8b36-48faa90030a3\") " pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-84vwj" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.161784 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-555c7456bd-fhd72"] Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.179125 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6574bf987d-nw7mh"] Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.183150 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678chmhg"] Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.184514 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678chmhg" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.193120 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-55xpg" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.199016 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.206476 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678chmhg"] Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.219532 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-688db7b6c7-c7hxr"] Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.220782 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-7d8bb7f44c-pqrxt"] Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.221772 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-pqrxt" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.222224 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-c7hxr" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.227235 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-zcmj7" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.228444 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-gk94g" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.252514 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-j5d2d"] Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.253655 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-j5d2d" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.259451 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-t54xk" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.261566 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q5gh\" (UniqueName: \"kubernetes.io/projected/74855628-79e3-4300-8a8b-d05aeed1904b-kube-api-access-4q5gh\") pod \"nova-operator-controller-manager-555c7456bd-fhd72\" (UID: \"74855628-79e3-4300-8a8b-d05aeed1904b\") " pod="openstack-operators/nova-operator-controller-manager-555c7456bd-fhd72" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.261610 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b82b2ba-da6d-4441-a194-4b47207b159a-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678chmhg\" (UID: \"2b82b2ba-da6d-4441-a194-4b47207b159a\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678chmhg" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.261637 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j4z6\" (UniqueName: \"kubernetes.io/projected/5c00d52a-acc5-4650-8b36-48faa90030a3-kube-api-access-5j4z6\") pod \"octavia-operator-controller-manager-59d6cfdf45-84vwj\" (UID: \"5c00d52a-acc5-4650-8b36-48faa90030a3\") " pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-84vwj" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.261694 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ljvh\" (UniqueName: \"kubernetes.io/projected/2b82b2ba-da6d-4441-a194-4b47207b159a-kube-api-access-7ljvh\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678chmhg\" (UID: \"2b82b2ba-da6d-4441-a194-4b47207b159a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678chmhg" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.261725 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpfdc\" (UniqueName: \"kubernetes.io/projected/eaf48bda-c7ca-484b-8d8f-b195d011e8f9-kube-api-access-qpfdc\") pod \"neutron-operator-controller-manager-6574bf987d-nw7mh\" (UID: \"eaf48bda-c7ca-484b-8d8f-b195d011e8f9\") " pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-nw7mh" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.270122 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-fw4zb" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.292059 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j4z6\" (UniqueName: \"kubernetes.io/projected/5c00d52a-acc5-4650-8b36-48faa90030a3-kube-api-access-5j4z6\") pod \"octavia-operator-controller-manager-59d6cfdf45-84vwj\" (UID: \"5c00d52a-acc5-4650-8b36-48faa90030a3\") " pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-84vwj" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.292127 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-688db7b6c7-c7hxr"] Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.315214 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-599898f689-mhz7w" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.319827 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-j5d2d"] Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.342401 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-2tqv8" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.347512 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-7d8bb7f44c-pqrxt"] Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.365028 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b82b2ba-da6d-4441-a194-4b47207b159a-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678chmhg\" (UID: \"2b82b2ba-da6d-4441-a194-4b47207b159a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678chmhg" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.365165 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ljvh\" (UniqueName: \"kubernetes.io/projected/2b82b2ba-da6d-4441-a194-4b47207b159a-kube-api-access-7ljvh\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678chmhg\" (UID: \"2b82b2ba-da6d-4441-a194-4b47207b159a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678chmhg" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.365193 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gsrj\" (UniqueName: \"kubernetes.io/projected/14fc26c3-ab56-44a5-832c-55eaca43cc5c-kube-api-access-8gsrj\") pod \"ovn-operator-controller-manager-688db7b6c7-c7hxr\" (UID: \"14fc26c3-ab56-44a5-832c-55eaca43cc5c\") " pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-c7hxr" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.365272 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpfdc\" (UniqueName: \"kubernetes.io/projected/eaf48bda-c7ca-484b-8d8f-b195d011e8f9-kube-api-access-qpfdc\") pod 
\"neutron-operator-controller-manager-6574bf987d-nw7mh\" (UID: \"eaf48bda-c7ca-484b-8d8f-b195d011e8f9\") " pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-nw7mh" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.365325 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgjd4\" (UniqueName: \"kubernetes.io/projected/35aefd42-2274-451a-8526-fb99c1f72be0-kube-api-access-dgjd4\") pod \"placement-operator-controller-manager-7d8bb7f44c-pqrxt\" (UID: \"35aefd42-2274-451a-8526-fb99c1f72be0\") " pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-pqrxt" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.365352 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcgdr\" (UniqueName: \"kubernetes.io/projected/e9d1e188-b3ff-4807-a57e-9bf290e10f22-kube-api-access-bcgdr\") pod \"swift-operator-controller-manager-6859f9b676-j5d2d\" (UID: \"e9d1e188-b3ff-4807-a57e-9bf290e10f22\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-j5d2d" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.365372 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q5gh\" (UniqueName: \"kubernetes.io/projected/74855628-79e3-4300-8a8b-d05aeed1904b-kube-api-access-4q5gh\") pod \"nova-operator-controller-manager-555c7456bd-fhd72\" (UID: \"74855628-79e3-4300-8a8b-d05aeed1904b\") " pod="openstack-operators/nova-operator-controller-manager-555c7456bd-fhd72" Oct 03 14:57:54 crc kubenswrapper[4774]: E1003 14:57:54.365715 4774 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 03 14:57:54 crc kubenswrapper[4774]: E1003 14:57:54.365754 4774 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/2b82b2ba-da6d-4441-a194-4b47207b159a-cert podName:2b82b2ba-da6d-4441-a194-4b47207b159a nodeName:}" failed. No retries permitted until 2025-10-03 14:57:54.865740891 +0000 UTC m=+897.454944343 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2b82b2ba-da6d-4441-a194-4b47207b159a-cert") pod "openstack-baremetal-operator-controller-manager-6f64c4d678chmhg" (UID: "2b82b2ba-da6d-4441-a194-4b47207b159a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.400584 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q5gh\" (UniqueName: \"kubernetes.io/projected/74855628-79e3-4300-8a8b-d05aeed1904b-kube-api-access-4q5gh\") pod \"nova-operator-controller-manager-555c7456bd-fhd72\" (UID: \"74855628-79e3-4300-8a8b-d05aeed1904b\") " pod="openstack-operators/nova-operator-controller-manager-555c7456bd-fhd72" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.400642 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5db5cf686f-zpq8n"] Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.401666 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-zpq8n" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.401818 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ljvh\" (UniqueName: \"kubernetes.io/projected/2b82b2ba-da6d-4441-a194-4b47207b159a-kube-api-access-7ljvh\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678chmhg\" (UID: \"2b82b2ba-da6d-4441-a194-4b47207b159a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678chmhg" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.409661 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-s54zb" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.422228 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpfdc\" (UniqueName: \"kubernetes.io/projected/eaf48bda-c7ca-484b-8d8f-b195d011e8f9-kube-api-access-qpfdc\") pod \"neutron-operator-controller-manager-6574bf987d-nw7mh\" (UID: \"eaf48bda-c7ca-484b-8d8f-b195d011e8f9\") " pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-nw7mh" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.443021 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-ftzfn"] Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.446968 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-ftzfn" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.449748 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5db5cf686f-zpq8n"] Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.450815 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-4cslt" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.462718 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-84vwj" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.465863 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc3a311f-a6c2-40e4-aaae-549aa2395c57-cert\") pod \"infra-operator-controller-manager-5fbf469cd7-zhmr8\" (UID: \"bc3a311f-a6c2-40e4-aaae-549aa2395c57\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-zhmr8" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.465899 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgjd4\" (UniqueName: \"kubernetes.io/projected/35aefd42-2274-451a-8526-fb99c1f72be0-kube-api-access-dgjd4\") pod \"placement-operator-controller-manager-7d8bb7f44c-pqrxt\" (UID: \"35aefd42-2274-451a-8526-fb99c1f72be0\") " pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-pqrxt" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.465931 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcgdr\" (UniqueName: \"kubernetes.io/projected/e9d1e188-b3ff-4807-a57e-9bf290e10f22-kube-api-access-bcgdr\") pod \"swift-operator-controller-manager-6859f9b676-j5d2d\" (UID: \"e9d1e188-b3ff-4807-a57e-9bf290e10f22\") " 
pod="openstack-operators/swift-operator-controller-manager-6859f9b676-j5d2d" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.465979 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76qnj\" (UniqueName: \"kubernetes.io/projected/fcb0af6a-547c-4555-86f9-f0b390ae7ce3-kube-api-access-76qnj\") pod \"telemetry-operator-controller-manager-5db5cf686f-zpq8n\" (UID: \"fcb0af6a-547c-4555-86f9-f0b390ae7ce3\") " pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-zpq8n" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.466015 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gsrj\" (UniqueName: \"kubernetes.io/projected/14fc26c3-ab56-44a5-832c-55eaca43cc5c-kube-api-access-8gsrj\") pod \"ovn-operator-controller-manager-688db7b6c7-c7hxr\" (UID: \"14fc26c3-ab56-44a5-832c-55eaca43cc5c\") " pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-c7hxr" Oct 03 14:57:54 crc kubenswrapper[4774]: E1003 14:57:54.466302 4774 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 03 14:57:54 crc kubenswrapper[4774]: E1003 14:57:54.466340 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc3a311f-a6c2-40e4-aaae-549aa2395c57-cert podName:bc3a311f-a6c2-40e4-aaae-549aa2395c57 nodeName:}" failed. No retries permitted until 2025-10-03 14:57:55.466327387 +0000 UTC m=+898.055530839 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bc3a311f-a6c2-40e4-aaae-549aa2395c57-cert") pod "infra-operator-controller-manager-5fbf469cd7-zhmr8" (UID: "bc3a311f-a6c2-40e4-aaae-549aa2395c57") : secret "infra-operator-webhook-server-cert" not found Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.468491 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-ftzfn"] Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.486031 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-nw7mh" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.503065 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gsrj\" (UniqueName: \"kubernetes.io/projected/14fc26c3-ab56-44a5-832c-55eaca43cc5c-kube-api-access-8gsrj\") pod \"ovn-operator-controller-manager-688db7b6c7-c7hxr\" (UID: \"14fc26c3-ab56-44a5-832c-55eaca43cc5c\") " pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-c7hxr" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.509241 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-fcd7d9895-ngbtk"] Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.510243 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-ngbtk" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.511702 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgjd4\" (UniqueName: \"kubernetes.io/projected/35aefd42-2274-451a-8526-fb99c1f72be0-kube-api-access-dgjd4\") pod \"placement-operator-controller-manager-7d8bb7f44c-pqrxt\" (UID: \"35aefd42-2274-451a-8526-fb99c1f72be0\") " pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-pqrxt" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.513549 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-q2q6r" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.523186 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcgdr\" (UniqueName: \"kubernetes.io/projected/e9d1e188-b3ff-4807-a57e-9bf290e10f22-kube-api-access-bcgdr\") pod \"swift-operator-controller-manager-6859f9b676-j5d2d\" (UID: \"e9d1e188-b3ff-4807-a57e-9bf290e10f22\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-j5d2d" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.525858 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-fcd7d9895-ngbtk"] Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.535364 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-fhd72" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.568098 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btwrk\" (UniqueName: \"kubernetes.io/projected/b8324f27-b72f-4ad9-adcb-82469098520a-kube-api-access-btwrk\") pod \"watcher-operator-controller-manager-fcd7d9895-ngbtk\" (UID: \"b8324f27-b72f-4ad9-adcb-82469098520a\") " pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-ngbtk" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.568209 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76qnj\" (UniqueName: \"kubernetes.io/projected/fcb0af6a-547c-4555-86f9-f0b390ae7ce3-kube-api-access-76qnj\") pod \"telemetry-operator-controller-manager-5db5cf686f-zpq8n\" (UID: \"fcb0af6a-547c-4555-86f9-f0b390ae7ce3\") " pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-zpq8n" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.568237 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwd76\" (UniqueName: \"kubernetes.io/projected/6f1973d7-94ab-4855-bfd4-91f1e677306f-kube-api-access-bwd76\") pod \"test-operator-controller-manager-5cd5cb47d7-ftzfn\" (UID: \"6f1973d7-94ab-4855-bfd4-91f1e677306f\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-ftzfn" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.572016 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-c7hxr" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.572500 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-pqrxt" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.583723 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-j5d2d" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.590624 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6977957f88-8kmrq"] Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.592605 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6977957f88-8kmrq" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.595348 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.595434 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-hdg22" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.598493 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76qnj\" (UniqueName: \"kubernetes.io/projected/fcb0af6a-547c-4555-86f9-f0b390ae7ce3-kube-api-access-76qnj\") pod \"telemetry-operator-controller-manager-5db5cf686f-zpq8n\" (UID: \"fcb0af6a-547c-4555-86f9-f0b390ae7ce3\") " pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-zpq8n" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.638149 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6977957f88-8kmrq"] Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.675143 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btwrk\" (UniqueName: 
\"kubernetes.io/projected/b8324f27-b72f-4ad9-adcb-82469098520a-kube-api-access-btwrk\") pod \"watcher-operator-controller-manager-fcd7d9895-ngbtk\" (UID: \"b8324f27-b72f-4ad9-adcb-82469098520a\") " pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-ngbtk" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.675248 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fbjj\" (UniqueName: \"kubernetes.io/projected/73dd9462-cd4d-40d8-a416-c8ed1ef328fb-kube-api-access-7fbjj\") pod \"openstack-operator-controller-manager-6977957f88-8kmrq\" (UID: \"73dd9462-cd4d-40d8-a416-c8ed1ef328fb\") " pod="openstack-operators/openstack-operator-controller-manager-6977957f88-8kmrq" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.675274 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/73dd9462-cd4d-40d8-a416-c8ed1ef328fb-cert\") pod \"openstack-operator-controller-manager-6977957f88-8kmrq\" (UID: \"73dd9462-cd4d-40d8-a416-c8ed1ef328fb\") " pod="openstack-operators/openstack-operator-controller-manager-6977957f88-8kmrq" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.675374 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwd76\" (UniqueName: \"kubernetes.io/projected/6f1973d7-94ab-4855-bfd4-91f1e677306f-kube-api-access-bwd76\") pod \"test-operator-controller-manager-5cd5cb47d7-ftzfn\" (UID: \"6f1973d7-94ab-4855-bfd4-91f1e677306f\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-ftzfn" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.690221 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-xqmql"] Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.691584 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-xqmql" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.693357 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-vhjz9" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.694817 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btwrk\" (UniqueName: \"kubernetes.io/projected/b8324f27-b72f-4ad9-adcb-82469098520a-kube-api-access-btwrk\") pod \"watcher-operator-controller-manager-fcd7d9895-ngbtk\" (UID: \"b8324f27-b72f-4ad9-adcb-82469098520a\") " pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-ngbtk" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.698896 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwd76\" (UniqueName: \"kubernetes.io/projected/6f1973d7-94ab-4855-bfd4-91f1e677306f-kube-api-access-bwd76\") pod \"test-operator-controller-manager-5cd5cb47d7-ftzfn\" (UID: \"6f1973d7-94ab-4855-bfd4-91f1e677306f\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-ftzfn" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.707222 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-xqmql"] Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.708413 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-ngbtk" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.776121 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k82sv\" (UniqueName: \"kubernetes.io/projected/77b9687d-958c-47ad-835e-160fc6214d72-kube-api-access-k82sv\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-xqmql\" (UID: \"77b9687d-958c-47ad-835e-160fc6214d72\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-xqmql" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.776212 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fbjj\" (UniqueName: \"kubernetes.io/projected/73dd9462-cd4d-40d8-a416-c8ed1ef328fb-kube-api-access-7fbjj\") pod \"openstack-operator-controller-manager-6977957f88-8kmrq\" (UID: \"73dd9462-cd4d-40d8-a416-c8ed1ef328fb\") " pod="openstack-operators/openstack-operator-controller-manager-6977957f88-8kmrq" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.776232 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/73dd9462-cd4d-40d8-a416-c8ed1ef328fb-cert\") pod \"openstack-operator-controller-manager-6977957f88-8kmrq\" (UID: \"73dd9462-cd4d-40d8-a416-c8ed1ef328fb\") " pod="openstack-operators/openstack-operator-controller-manager-6977957f88-8kmrq" Oct 03 14:57:54 crc kubenswrapper[4774]: E1003 14:57:54.776374 4774 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 03 14:57:54 crc kubenswrapper[4774]: E1003 14:57:54.776436 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73dd9462-cd4d-40d8-a416-c8ed1ef328fb-cert podName:73dd9462-cd4d-40d8-a416-c8ed1ef328fb nodeName:}" failed. 
No retries permitted until 2025-10-03 14:57:55.276421073 +0000 UTC m=+897.865624525 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/73dd9462-cd4d-40d8-a416-c8ed1ef328fb-cert") pod "openstack-operator-controller-manager-6977957f88-8kmrq" (UID: "73dd9462-cd4d-40d8-a416-c8ed1ef328fb") : secret "webhook-server-cert" not found Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.811660 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fbjj\" (UniqueName: \"kubernetes.io/projected/73dd9462-cd4d-40d8-a416-c8ed1ef328fb-kube-api-access-7fbjj\") pod \"openstack-operator-controller-manager-6977957f88-8kmrq\" (UID: \"73dd9462-cd4d-40d8-a416-c8ed1ef328fb\") " pod="openstack-operators/openstack-operator-controller-manager-6977957f88-8kmrq" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.882062 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k82sv\" (UniqueName: \"kubernetes.io/projected/77b9687d-958c-47ad-835e-160fc6214d72-kube-api-access-k82sv\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-xqmql\" (UID: \"77b9687d-958c-47ad-835e-160fc6214d72\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-xqmql" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.882433 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b82b2ba-da6d-4441-a194-4b47207b159a-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678chmhg\" (UID: \"2b82b2ba-da6d-4441-a194-4b47207b159a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678chmhg" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.886732 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b82b2ba-da6d-4441-a194-4b47207b159a-cert\") pod 
\"openstack-baremetal-operator-controller-manager-6f64c4d678chmhg\" (UID: \"2b82b2ba-da6d-4441-a194-4b47207b159a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678chmhg" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.899273 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-zpq8n" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.899391 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k82sv\" (UniqueName: \"kubernetes.io/projected/77b9687d-958c-47ad-835e-160fc6214d72-kube-api-access-k82sv\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-xqmql\" (UID: \"77b9687d-958c-47ad-835e-160fc6214d72\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-xqmql" Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.981493 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79d68d6c85-6gckj"] Oct 03 14:57:54 crc kubenswrapper[4774]: I1003 14:57:54.986146 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-ftzfn" Oct 03 14:57:55 crc kubenswrapper[4774]: I1003 14:57:55.002433 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-846dff85b5-v5kkx"] Oct 03 14:57:55 crc kubenswrapper[4774]: I1003 14:57:55.056226 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-xqmql" Oct 03 14:57:55 crc kubenswrapper[4774]: I1003 14:57:55.131333 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678chmhg" Oct 03 14:57:55 crc kubenswrapper[4774]: I1003 14:57:55.262788 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6c675fb79f-b54rk"] Oct 03 14:57:55 crc kubenswrapper[4774]: I1003 14:57:55.286558 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-m2wmk"] Oct 03 14:57:55 crc kubenswrapper[4774]: I1003 14:57:55.294112 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/73dd9462-cd4d-40d8-a416-c8ed1ef328fb-cert\") pod \"openstack-operator-controller-manager-6977957f88-8kmrq\" (UID: \"73dd9462-cd4d-40d8-a416-c8ed1ef328fb\") " pod="openstack-operators/openstack-operator-controller-manager-6977957f88-8kmrq" Oct 03 14:57:55 crc kubenswrapper[4774]: E1003 14:57:55.294330 4774 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 03 14:57:55 crc kubenswrapper[4774]: E1003 14:57:55.294415 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73dd9462-cd4d-40d8-a416-c8ed1ef328fb-cert podName:73dd9462-cd4d-40d8-a416-c8ed1ef328fb nodeName:}" failed. No retries permitted until 2025-10-03 14:57:56.294395796 +0000 UTC m=+898.883599248 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/73dd9462-cd4d-40d8-a416-c8ed1ef328fb-cert") pod "openstack-operator-controller-manager-6977957f88-8kmrq" (UID: "73dd9462-cd4d-40d8-a416-c8ed1ef328fb") : secret "webhook-server-cert" not found Oct 03 14:57:55 crc kubenswrapper[4774]: W1003 14:57:55.307744 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cf018fb_edab_4e23_ad04_763ee25e1613.slice/crio-3d2de81183ae237bc7d84b3707bfef951375df0639fcf95bf8b61eacd4a1a7b1 WatchSource:0}: Error finding container 3d2de81183ae237bc7d84b3707bfef951375df0639fcf95bf8b61eacd4a1a7b1: Status 404 returned error can't find the container with id 3d2de81183ae237bc7d84b3707bfef951375df0639fcf95bf8b61eacd4a1a7b1 Oct 03 14:57:55 crc kubenswrapper[4774]: I1003 14:57:55.395029 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-mtjh7"] Oct 03 14:57:55 crc kubenswrapper[4774]: W1003 14:57:55.401168 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6edea7f2_581f_4f41_bdda_45e83dce680d.slice/crio-47a019314ebf7cf67693376ab7444065786ecd1380572a01068700b788277bf9 WatchSource:0}: Error finding container 47a019314ebf7cf67693376ab7444065786ecd1380572a01068700b788277bf9: Status 404 returned error can't find the container with id 47a019314ebf7cf67693376ab7444065786ecd1380572a01068700b788277bf9 Oct 03 14:57:55 crc kubenswrapper[4774]: I1003 14:57:55.407374 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-59d6cfdf45-84vwj"] Oct 03 14:57:55 crc kubenswrapper[4774]: W1003 14:57:55.416589 4774 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c00d52a_acc5_4650_8b36_48faa90030a3.slice/crio-251661f32fba2dedadc3c9e19f9dc721dc8ed87d44dc407fc63fb29bfcd1d80c WatchSource:0}: Error finding container 251661f32fba2dedadc3c9e19f9dc721dc8ed87d44dc407fc63fb29bfcd1d80c: Status 404 returned error can't find the container with id 251661f32fba2dedadc3c9e19f9dc721dc8ed87d44dc407fc63fb29bfcd1d80c Oct 03 14:57:55 crc kubenswrapper[4774]: I1003 14:57:55.421995 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-599898f689-mhz7w"] Oct 03 14:57:55 crc kubenswrapper[4774]: I1003 14:57:55.433945 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6fd6854b49-24bvc"] Oct 03 14:57:55 crc kubenswrapper[4774]: W1003 14:57:55.434989 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10674e8a_5afd_45f7_af36_e9dbfaf2dba0.slice/crio-02b6f7a69e7053297e1492726d38a565e95884d31d710fb9d713491c61deff75 WatchSource:0}: Error finding container 02b6f7a69e7053297e1492726d38a565e95884d31d710fb9d713491c61deff75: Status 404 returned error can't find the container with id 02b6f7a69e7053297e1492726d38a565e95884d31d710fb9d713491c61deff75 Oct 03 14:57:55 crc kubenswrapper[4774]: W1003 14:57:55.438211 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecf9de8d_83d4_41cb_9b00_e3aeedfb93fd.slice/crio-51f7e1f337198cdf545979958b59cca4980c56ea7e8b1ae83c34388df831ca34 WatchSource:0}: Error finding container 51f7e1f337198cdf545979958b59cca4980c56ea7e8b1ae83c34388df831ca34: Status 404 returned error can't find the container with id 51f7e1f337198cdf545979958b59cca4980c56ea7e8b1ae83c34388df831ca34 Oct 03 14:57:55 crc kubenswrapper[4774]: I1003 14:57:55.439544 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/ironic-operator-controller-manager-84bc9db6cc-2sg6f"] Oct 03 14:57:55 crc kubenswrapper[4774]: I1003 14:57:55.496628 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc3a311f-a6c2-40e4-aaae-549aa2395c57-cert\") pod \"infra-operator-controller-manager-5fbf469cd7-zhmr8\" (UID: \"bc3a311f-a6c2-40e4-aaae-549aa2395c57\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-zhmr8" Oct 03 14:57:55 crc kubenswrapper[4774]: I1003 14:57:55.508643 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc3a311f-a6c2-40e4-aaae-549aa2395c57-cert\") pod \"infra-operator-controller-manager-5fbf469cd7-zhmr8\" (UID: \"bc3a311f-a6c2-40e4-aaae-549aa2395c57\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-zhmr8" Oct 03 14:57:55 crc kubenswrapper[4774]: I1003 14:57:55.565783 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-zhmr8" Oct 03 14:57:55 crc kubenswrapper[4774]: I1003 14:57:55.608247 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-mtjh7" event={"ID":"6edea7f2-581f-4f41-bdda-45e83dce680d","Type":"ContainerStarted","Data":"47a019314ebf7cf67693376ab7444065786ecd1380572a01068700b788277bf9"} Oct 03 14:57:55 crc kubenswrapper[4774]: I1003 14:57:55.609598 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-2sg6f" event={"ID":"741c17eb-da65-4dce-abc6-7faa47d28004","Type":"ContainerStarted","Data":"9cd4ff2e9daf0056ff313a437c61271199775e2dfe5ddb5e3d0d3c6e610cd7a0"} Oct 03 14:57:55 crc kubenswrapper[4774]: I1003 14:57:55.610817 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-v5kkx" event={"ID":"29905a36-139e-4611-bc8e-0289dd1fa0b4","Type":"ContainerStarted","Data":"0b0f29a89b7336b9a1d15f7ef540feb7a25763512233c210693e335c9d998eb7"} Oct 03 14:57:55 crc kubenswrapper[4774]: I1003 14:57:55.612451 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-m2wmk" event={"ID":"4cf018fb-edab-4e23-ad04-763ee25e1613","Type":"ContainerStarted","Data":"3d2de81183ae237bc7d84b3707bfef951375df0639fcf95bf8b61eacd4a1a7b1"} Oct 03 14:57:55 crc kubenswrapper[4774]: I1003 14:57:55.613802 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-84vwj" event={"ID":"5c00d52a-acc5-4650-8b36-48faa90030a3","Type":"ContainerStarted","Data":"251661f32fba2dedadc3c9e19f9dc721dc8ed87d44dc407fc63fb29bfcd1d80c"} Oct 03 14:57:55 crc kubenswrapper[4774]: I1003 14:57:55.614788 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/heat-operator-controller-manager-599898f689-mhz7w" event={"ID":"ecf9de8d-83d4-41cb-9b00-e3aeedfb93fd","Type":"ContainerStarted","Data":"51f7e1f337198cdf545979958b59cca4980c56ea7e8b1ae83c34388df831ca34"} Oct 03 14:57:55 crc kubenswrapper[4774]: I1003 14:57:55.616084 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-b54rk" event={"ID":"cea60414-e959-4200-b3e5-e532d2136047","Type":"ContainerStarted","Data":"3578be7949ae6fbfd25fb6073c2547887e545ff9255c94625671cd963ff98a16"} Oct 03 14:57:55 crc kubenswrapper[4774]: I1003 14:57:55.617103 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-24bvc" event={"ID":"10674e8a-5afd-45f7-af36-e9dbfaf2dba0","Type":"ContainerStarted","Data":"02b6f7a69e7053297e1492726d38a565e95884d31d710fb9d713491c61deff75"} Oct 03 14:57:55 crc kubenswrapper[4774]: I1003 14:57:55.618088 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-6gckj" event={"ID":"4c3d1495-6568-44c2-9bd7-82256a4b5aab","Type":"ContainerStarted","Data":"3168f5fd0ba56a0a70f696e347c28b52ae5fd369220062bafdb8cb9337c3ada4"} Oct 03 14:57:55 crc kubenswrapper[4774]: I1003 14:57:55.786919 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5db5cf686f-zpq8n"] Oct 03 14:57:55 crc kubenswrapper[4774]: I1003 14:57:55.806800 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-ftzfn"] Oct 03 14:57:55 crc kubenswrapper[4774]: I1003 14:57:55.807742 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-xqmql"] Oct 03 14:57:55 crc kubenswrapper[4774]: W1003 14:57:55.817561 4774 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcb0af6a_547c_4555_86f9_f0b390ae7ce3.slice/crio-88ca391a06bdf9b1aef0c5c776fc7ac20c954ee4b3cc045ad7d6ddd1c8e13b64 WatchSource:0}: Error finding container 88ca391a06bdf9b1aef0c5c776fc7ac20c954ee4b3cc045ad7d6ddd1c8e13b64: Status 404 returned error can't find the container with id 88ca391a06bdf9b1aef0c5c776fc7ac20c954ee4b3cc045ad7d6ddd1c8e13b64 Oct 03 14:57:55 crc kubenswrapper[4774]: I1003 14:57:55.819014 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6574bf987d-nw7mh"] Oct 03 14:57:55 crc kubenswrapper[4774]: I1003 14:57:55.832762 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7f55849f88-fw4zb"] Oct 03 14:57:55 crc kubenswrapper[4774]: I1003 14:57:55.851223 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-fcd7d9895-ngbtk"] Oct 03 14:57:55 crc kubenswrapper[4774]: I1003 14:57:55.854545 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-7d8bb7f44c-pqrxt"] Oct 03 14:57:55 crc kubenswrapper[4774]: I1003 14:57:55.862182 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-j5d2d"] Oct 03 14:57:55 crc kubenswrapper[4774]: I1003 14:57:55.862226 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-688db7b6c7-c7hxr"] Oct 03 14:57:55 crc kubenswrapper[4774]: I1003 14:57:55.869471 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-555c7456bd-fhd72"] Oct 03 14:57:55 crc kubenswrapper[4774]: I1003 14:57:55.871259 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6769b867d9-2tqv8"] Oct 03 
14:57:55 crc kubenswrapper[4774]: I1003 14:57:55.920758 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678chmhg"] Oct 03 14:57:55 crc kubenswrapper[4774]: E1003 14:57:55.929250 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bcgdr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-6859f9b676-j5d2d_openstack-operators(e9d1e188-b3ff-4807-a57e-9bf290e10f22): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 14:57:55 crc kubenswrapper[4774]: E1003 14:57:55.929356 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:018151bd5ff830ec03c6b8e3d53cfb9456ca6e1e34793bdd4f7edd39a0146fa6,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-btwrk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-fcd7d9895-ngbtk_openstack-operators(b8324f27-b72f-4ad9-adcb-82469098520a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 14:57:55 crc kubenswrapper[4774]: E1003 14:57:55.929400 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:5c6ab93b78bd20eb7f1736751a59c1eb33fb06351339563dbefe49ccaaff6e94,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8gsrj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
ovn-operator-controller-manager-688db7b6c7-c7hxr_openstack-operators(14fc26c3-ab56-44a5-832c-55eaca43cc5c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 14:57:55 crc kubenswrapper[4774]: E1003 14:57:55.933038 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:516f76ed86dd34225e6d0309451c7886bb81ff69032ba28125ae4d0cec54bce7,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d644k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7f55849f88-fw4zb_openstack-operators(b32ca090-1129-4c77-a2b3-df9e51a35a48): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 14:57:55 crc kubenswrapper[4774]: E1003 14:57:55.935758 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:e9ff0784bffe5b9a6d1a77a1b8866dd26b8d0c54465707df1808f68caad93a95,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nlbns,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-6769b867d9-2tqv8_openstack-operators(02bdd1b6-4d8f-40ce-b0fe-449c738d5d0e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 14:57:55 crc kubenswrapper[4774]: E1003 14:57:55.937086 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:a82409e6d6a5554aad95acfe6fa4784e33de19a963eb8b1da1a80a3e6cf1ab55,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4q5gh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
nova-operator-controller-manager-555c7456bd-fhd72_openstack-operators(74855628-79e3-4300-8a8b-d05aeed1904b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 14:57:55 crc kubenswrapper[4774]: W1003 14:57:55.948384 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b82b2ba_da6d_4441_a194_4b47207b159a.slice/crio-6f3e1bef7665cd303c4a9e49193055721ab989d944ccc97ba32c7b2b648cd445 WatchSource:0}: Error finding container 6f3e1bef7665cd303c4a9e49193055721ab989d944ccc97ba32c7b2b648cd445: Status 404 returned error can't find the container with id 6f3e1bef7665cd303c4a9e49193055721ab989d944ccc97ba32c7b2b648cd445 Oct 03 14:57:55 crc kubenswrapper[4774]: E1003 14:57:55.958789 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:f50229c8a33fd581bccbe5f34bbaf3936c1b454802e755c9b48b40b76a8239ee,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_D
EFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_I
MAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,}
,EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMA
GE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podifi
ed-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DB
CLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resour
ces:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7ljvh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-6f64c4d678chmhg_openstack-operators(2b82b2ba-da6d-4441-a194-4b47207b159a): ErrImagePull: pull QPS exceeded" 
logger="UnhandledError" Oct 03 14:57:56 crc kubenswrapper[4774]: I1003 14:57:56.172036 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5fbf469cd7-zhmr8"] Oct 03 14:57:56 crc kubenswrapper[4774]: E1003 14:57:56.270202 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-j5d2d" podUID="e9d1e188-b3ff-4807-a57e-9bf290e10f22" Oct 03 14:57:56 crc kubenswrapper[4774]: I1003 14:57:56.319243 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/73dd9462-cd4d-40d8-a416-c8ed1ef328fb-cert\") pod \"openstack-operator-controller-manager-6977957f88-8kmrq\" (UID: \"73dd9462-cd4d-40d8-a416-c8ed1ef328fb\") " pod="openstack-operators/openstack-operator-controller-manager-6977957f88-8kmrq" Oct 03 14:57:56 crc kubenswrapper[4774]: E1003 14:57:56.322063 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-c7hxr" podUID="14fc26c3-ab56-44a5-832c-55eaca43cc5c" Oct 03 14:57:56 crc kubenswrapper[4774]: I1003 14:57:56.336176 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/73dd9462-cd4d-40d8-a416-c8ed1ef328fb-cert\") pod \"openstack-operator-controller-manager-6977957f88-8kmrq\" (UID: \"73dd9462-cd4d-40d8-a416-c8ed1ef328fb\") " pod="openstack-operators/openstack-operator-controller-manager-6977957f88-8kmrq" Oct 03 14:57:56 crc kubenswrapper[4774]: E1003 14:57:56.457303 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" 
pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-ngbtk" podUID="b8324f27-b72f-4ad9-adcb-82469098520a" Oct 03 14:57:56 crc kubenswrapper[4774]: E1003 14:57:56.463712 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-fw4zb" podUID="b32ca090-1129-4c77-a2b3-df9e51a35a48" Oct 03 14:57:56 crc kubenswrapper[4774]: E1003 14:57:56.476494 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-2tqv8" podUID="02bdd1b6-4d8f-40ce-b0fe-449c738d5d0e" Oct 03 14:57:56 crc kubenswrapper[4774]: E1003 14:57:56.476710 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678chmhg" podUID="2b82b2ba-da6d-4441-a194-4b47207b159a" Oct 03 14:57:56 crc kubenswrapper[4774]: E1003 14:57:56.509301 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-fhd72" podUID="74855628-79e3-4300-8a8b-d05aeed1904b" Oct 03 14:57:56 crc kubenswrapper[4774]: I1003 14:57:56.534185 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6977957f88-8kmrq" Oct 03 14:57:56 crc kubenswrapper[4774]: I1003 14:57:56.647087 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-zhmr8" event={"ID":"bc3a311f-a6c2-40e4-aaae-549aa2395c57","Type":"ContainerStarted","Data":"13adf9e38286bbb1053e5a53b96c527357d59b39deb8fb59c0c9f9993ff4e0b4"} Oct 03 14:57:56 crc kubenswrapper[4774]: I1003 14:57:56.717075 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-xqmql" event={"ID":"77b9687d-958c-47ad-835e-160fc6214d72","Type":"ContainerStarted","Data":"faf657dbd7c1573b0bfd73f6d95b448eb34db9c4ae3bcee3c73d05a778e564e4"} Oct 03 14:57:56 crc kubenswrapper[4774]: I1003 14:57:56.730078 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-fhd72" event={"ID":"74855628-79e3-4300-8a8b-d05aeed1904b","Type":"ContainerStarted","Data":"3d4ca46b76efe070a532f79f577ed3de38b899925991eb344ef66d0efd46c24b"} Oct 03 14:57:56 crc kubenswrapper[4774]: I1003 14:57:56.730131 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-fhd72" event={"ID":"74855628-79e3-4300-8a8b-d05aeed1904b","Type":"ContainerStarted","Data":"6fd83713489ad30b583ab868628f1499c48d0d0ebfbf6e495c0a4845125fce8a"} Oct 03 14:57:56 crc kubenswrapper[4774]: E1003 14:57:56.732684 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:a82409e6d6a5554aad95acfe6fa4784e33de19a963eb8b1da1a80a3e6cf1ab55\\\"\"" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-fhd72" podUID="74855628-79e3-4300-8a8b-d05aeed1904b" Oct 03 14:57:56 crc 
kubenswrapper[4774]: I1003 14:57:56.735132 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-zpq8n" event={"ID":"fcb0af6a-547c-4555-86f9-f0b390ae7ce3","Type":"ContainerStarted","Data":"88ca391a06bdf9b1aef0c5c776fc7ac20c954ee4b3cc045ad7d6ddd1c8e13b64"} Oct 03 14:57:56 crc kubenswrapper[4774]: I1003 14:57:56.736677 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-pqrxt" event={"ID":"35aefd42-2274-451a-8526-fb99c1f72be0","Type":"ContainerStarted","Data":"bfa76390ed747bec190b18b8f8faebe7503c777e647de1e6e3bf7b66da9e46f8"} Oct 03 14:57:56 crc kubenswrapper[4774]: I1003 14:57:56.758689 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-nw7mh" event={"ID":"eaf48bda-c7ca-484b-8d8f-b195d011e8f9","Type":"ContainerStarted","Data":"2a538bd2817ca29c4b2bfc0299a0f362128f96737c92528f065b079c0154f495"} Oct 03 14:57:56 crc kubenswrapper[4774]: I1003 14:57:56.776954 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-fw4zb" event={"ID":"b32ca090-1129-4c77-a2b3-df9e51a35a48","Type":"ContainerStarted","Data":"becf9d0c93274a7778609834c9b3696e394eb657b406aa7aaebbf5cdcf603166"} Oct 03 14:57:56 crc kubenswrapper[4774]: I1003 14:57:56.777033 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-fw4zb" event={"ID":"b32ca090-1129-4c77-a2b3-df9e51a35a48","Type":"ContainerStarted","Data":"ceae53f404348211cda7ef5b9d5edebe1ad6a49ff2a2f29daf70180f1bda88c9"} Oct 03 14:57:56 crc kubenswrapper[4774]: E1003 14:57:56.778219 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:516f76ed86dd34225e6d0309451c7886bb81ff69032ba28125ae4d0cec54bce7\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-fw4zb" podUID="b32ca090-1129-4c77-a2b3-df9e51a35a48" Oct 03 14:57:56 crc kubenswrapper[4774]: I1003 14:57:56.779278 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-c7hxr" event={"ID":"14fc26c3-ab56-44a5-832c-55eaca43cc5c","Type":"ContainerStarted","Data":"e8ee9c3af2b35930ceca8727a1a964e230460f6a41d5e72c6b06c9e26c915b50"} Oct 03 14:57:56 crc kubenswrapper[4774]: I1003 14:57:56.779295 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-c7hxr" event={"ID":"14fc26c3-ab56-44a5-832c-55eaca43cc5c","Type":"ContainerStarted","Data":"58ec3cbab951628d5ed566d9dbc3014bacb755682c397383faafebdae1e3a441"} Oct 03 14:57:56 crc kubenswrapper[4774]: E1003 14:57:56.793162 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:5c6ab93b78bd20eb7f1736751a59c1eb33fb06351339563dbefe49ccaaff6e94\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-c7hxr" podUID="14fc26c3-ab56-44a5-832c-55eaca43cc5c" Oct 03 14:57:56 crc kubenswrapper[4774]: I1003 14:57:56.802480 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-ngbtk" event={"ID":"b8324f27-b72f-4ad9-adcb-82469098520a","Type":"ContainerStarted","Data":"7f6b640eff566655583707cba77f6724b7dbe5e08da15f28a05b8a84e9fb92bb"} Oct 03 14:57:56 crc kubenswrapper[4774]: I1003 14:57:56.802517 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-ngbtk" 
event={"ID":"b8324f27-b72f-4ad9-adcb-82469098520a","Type":"ContainerStarted","Data":"8cff18a2f6d9995eb6e0b3b6b82599722a3368f4bd4c1eb4c9b17ee8de7c5951"} Oct 03 14:57:56 crc kubenswrapper[4774]: E1003 14:57:56.803684 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:018151bd5ff830ec03c6b8e3d53cfb9456ca6e1e34793bdd4f7edd39a0146fa6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-ngbtk" podUID="b8324f27-b72f-4ad9-adcb-82469098520a" Oct 03 14:57:56 crc kubenswrapper[4774]: I1003 14:57:56.811903 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-ftzfn" event={"ID":"6f1973d7-94ab-4855-bfd4-91f1e677306f","Type":"ContainerStarted","Data":"736cbee771a1c9e2659e352ec0eec82f32b3634b90f633df9e78ccbe8131a343"} Oct 03 14:57:56 crc kubenswrapper[4774]: I1003 14:57:56.818531 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-j5d2d" event={"ID":"e9d1e188-b3ff-4807-a57e-9bf290e10f22","Type":"ContainerStarted","Data":"fe35de93b10eef0739669c85f35d29d7536b76ea5cfcd274fafc66a436a2c179"} Oct 03 14:57:56 crc kubenswrapper[4774]: I1003 14:57:56.818588 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-j5d2d" event={"ID":"e9d1e188-b3ff-4807-a57e-9bf290e10f22","Type":"ContainerStarted","Data":"23ea63c6d0a339eb16ebd88d7078ebd0c0c4d60363bf180f3e2a5a7fb3ed3cad"} Oct 03 14:57:56 crc kubenswrapper[4774]: E1003 14:57:56.820151 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-j5d2d" podUID="e9d1e188-b3ff-4807-a57e-9bf290e10f22" Oct 03 14:57:56 crc kubenswrapper[4774]: I1003 14:57:56.822386 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-2tqv8" event={"ID":"02bdd1b6-4d8f-40ce-b0fe-449c738d5d0e","Type":"ContainerStarted","Data":"29866e4ba28994472a76c79c544398660c7fa71af52338dbebe9abf83befd55b"} Oct 03 14:57:56 crc kubenswrapper[4774]: I1003 14:57:56.822457 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-2tqv8" event={"ID":"02bdd1b6-4d8f-40ce-b0fe-449c738d5d0e","Type":"ContainerStarted","Data":"b5f71378dbb888cd90852c76aab05f6da60a7efd01f4e3e26c96e19e45de3c35"} Oct 03 14:57:56 crc kubenswrapper[4774]: E1003 14:57:56.826978 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:e9ff0784bffe5b9a6d1a77a1b8866dd26b8d0c54465707df1808f68caad93a95\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-2tqv8" podUID="02bdd1b6-4d8f-40ce-b0fe-449c738d5d0e" Oct 03 14:57:56 crc kubenswrapper[4774]: I1003 14:57:56.839173 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678chmhg" event={"ID":"2b82b2ba-da6d-4441-a194-4b47207b159a","Type":"ContainerStarted","Data":"980e903b01e3dedaa19b4ee493826c26ace61545f2668eb636256d7d5d39e9a4"} Oct 03 14:57:56 crc kubenswrapper[4774]: I1003 14:57:56.839225 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678chmhg" 
event={"ID":"2b82b2ba-da6d-4441-a194-4b47207b159a","Type":"ContainerStarted","Data":"6f3e1bef7665cd303c4a9e49193055721ab989d944ccc97ba32c7b2b648cd445"} Oct 03 14:57:56 crc kubenswrapper[4774]: E1003 14:57:56.851050 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:f50229c8a33fd581bccbe5f34bbaf3936c1b454802e755c9b48b40b76a8239ee\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678chmhg" podUID="2b82b2ba-da6d-4441-a194-4b47207b159a" Oct 03 14:57:57 crc kubenswrapper[4774]: I1003 14:57:57.365298 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6977957f88-8kmrq"] Oct 03 14:57:57 crc kubenswrapper[4774]: W1003 14:57:57.391607 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73dd9462_cd4d_40d8_a416_c8ed1ef328fb.slice/crio-8b4960988dc89897c24b07b65122ffcef9b969b5f489f9a39337aba4301da7dc WatchSource:0}: Error finding container 8b4960988dc89897c24b07b65122ffcef9b969b5f489f9a39337aba4301da7dc: Status 404 returned error can't find the container with id 8b4960988dc89897c24b07b65122ffcef9b969b5f489f9a39337aba4301da7dc Oct 03 14:57:57 crc kubenswrapper[4774]: I1003 14:57:57.851665 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6977957f88-8kmrq" event={"ID":"73dd9462-cd4d-40d8-a416-c8ed1ef328fb","Type":"ContainerStarted","Data":"68f9799cc33d0dc9e9da6aee4f6ac9065303200ae65c11e5b32917f07eae7188"} Oct 03 14:57:57 crc kubenswrapper[4774]: I1003 14:57:57.851709 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6977957f88-8kmrq" 
event={"ID":"73dd9462-cd4d-40d8-a416-c8ed1ef328fb","Type":"ContainerStarted","Data":"8b4960988dc89897c24b07b65122ffcef9b969b5f489f9a39337aba4301da7dc"} Oct 03 14:57:57 crc kubenswrapper[4774]: E1003 14:57:57.857987 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:516f76ed86dd34225e6d0309451c7886bb81ff69032ba28125ae4d0cec54bce7\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-fw4zb" podUID="b32ca090-1129-4c77-a2b3-df9e51a35a48" Oct 03 14:57:57 crc kubenswrapper[4774]: E1003 14:57:57.858478 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-j5d2d" podUID="e9d1e188-b3ff-4807-a57e-9bf290e10f22" Oct 03 14:57:57 crc kubenswrapper[4774]: E1003 14:57:57.858473 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:5c6ab93b78bd20eb7f1736751a59c1eb33fb06351339563dbefe49ccaaff6e94\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-c7hxr" podUID="14fc26c3-ab56-44a5-832c-55eaca43cc5c" Oct 03 14:57:57 crc kubenswrapper[4774]: E1003 14:57:57.858599 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:f50229c8a33fd581bccbe5f34bbaf3936c1b454802e755c9b48b40b76a8239ee\\\"\"" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678chmhg" podUID="2b82b2ba-da6d-4441-a194-4b47207b159a" Oct 03 14:57:57 crc kubenswrapper[4774]: E1003 14:57:57.858605 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:a82409e6d6a5554aad95acfe6fa4784e33de19a963eb8b1da1a80a3e6cf1ab55\\\"\"" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-fhd72" podUID="74855628-79e3-4300-8a8b-d05aeed1904b" Oct 03 14:57:57 crc kubenswrapper[4774]: E1003 14:57:57.858632 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:e9ff0784bffe5b9a6d1a77a1b8866dd26b8d0c54465707df1808f68caad93a95\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-2tqv8" podUID="02bdd1b6-4d8f-40ce-b0fe-449c738d5d0e" Oct 03 14:57:57 crc kubenswrapper[4774]: E1003 14:57:57.858683 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:018151bd5ff830ec03c6b8e3d53cfb9456ca6e1e34793bdd4f7edd39a0146fa6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-ngbtk" podUID="b8324f27-b72f-4ad9-adcb-82469098520a" Oct 03 14:57:58 crc kubenswrapper[4774]: I1003 14:57:58.859420 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6977957f88-8kmrq" event={"ID":"73dd9462-cd4d-40d8-a416-c8ed1ef328fb","Type":"ContainerStarted","Data":"2bbd5518b5da2ee96a280ff81fa49ffe4bd7f23981600a3f11f507146f89f5bd"} Oct 03 14:57:58 crc kubenswrapper[4774]: I1003 14:57:58.860446 4774 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6977957f88-8kmrq" Oct 03 14:57:58 crc kubenswrapper[4774]: I1003 14:57:58.888924 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6977957f88-8kmrq" podStartSLOduration=4.888904853 podStartE2EDuration="4.888904853s" podCreationTimestamp="2025-10-03 14:57:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:57:58.882035382 +0000 UTC m=+901.471238834" watchObservedRunningTime="2025-10-03 14:57:58.888904853 +0000 UTC m=+901.478108305" Oct 03 14:58:06 crc kubenswrapper[4774]: I1003 14:58:06.542960 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6977957f88-8kmrq" Oct 03 14:58:12 crc kubenswrapper[4774]: E1003 14:58:12.822017 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:110b885fe640ffdd8536e7da2a613677a6777e3d902e2ff15fa4d5968fe06c54" Oct 03 14:58:12 crc kubenswrapper[4774]: E1003 14:58:12.822751 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:110b885fe640ffdd8536e7da2a613677a6777e3d902e2ff15fa4d5968fe06c54,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m 
DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ktjw4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-5c468bf4d4-mtjh7_openstack-operators(6edea7f2-581f-4f41-bdda-45e83dce680d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 14:58:13 crc kubenswrapper[4774]: E1003 14:58:13.177409 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with 
ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-mtjh7" podUID="6edea7f2-581f-4f41-bdda-45e83dce680d" Oct 03 14:58:13 crc kubenswrapper[4774]: I1003 14:58:13.971715 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-v5kkx" event={"ID":"29905a36-139e-4611-bc8e-0289dd1fa0b4","Type":"ContainerStarted","Data":"7b376d1d2c1021a9392dcecaf0fa934f7bf2300f3b570891788e4f72eb01f15e"} Oct 03 14:58:13 crc kubenswrapper[4774]: I1003 14:58:13.981846 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-ftzfn" event={"ID":"6f1973d7-94ab-4855-bfd4-91f1e677306f","Type":"ContainerStarted","Data":"aba779a96deab47789d4fdb95226286fb964f59b7d6816d16b82287d6f8243ac"} Oct 03 14:58:13 crc kubenswrapper[4774]: I1003 14:58:13.981883 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-ftzfn" event={"ID":"6f1973d7-94ab-4855-bfd4-91f1e677306f","Type":"ContainerStarted","Data":"6f5f738cb8c611547a8c8de3dfaaf7b4c048c133534b36bf21f32d6588c92efb"} Oct 03 14:58:13 crc kubenswrapper[4774]: I1003 14:58:13.982717 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-ftzfn" Oct 03 14:58:13 crc kubenswrapper[4774]: I1003 14:58:13.984285 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-m2wmk" event={"ID":"4cf018fb-edab-4e23-ad04-763ee25e1613","Type":"ContainerStarted","Data":"268e38710fef9930de125f975b6854b4a1cf4f9fa94a8545d4a09703dc4ebaf5"} Oct 03 14:58:13 crc kubenswrapper[4774]: I1003 14:58:13.986429 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-6gckj" event={"ID":"4c3d1495-6568-44c2-9bd7-82256a4b5aab","Type":"ContainerStarted","Data":"c1e61c9fa8c52b240c2552b1cdebb0ccc0dcab01f24d8b28ddf3b774f7cf050d"} Oct 03 14:58:13 crc kubenswrapper[4774]: I1003 14:58:13.986450 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-6gckj" event={"ID":"4c3d1495-6568-44c2-9bd7-82256a4b5aab","Type":"ContainerStarted","Data":"bb4c715238d3f0b8f011b5e2b75bc9f6cc61dec18b6baf08b99d4959da3ba80c"} Oct 03 14:58:13 crc kubenswrapper[4774]: I1003 14:58:13.986909 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-6gckj" Oct 03 14:58:13 crc kubenswrapper[4774]: I1003 14:58:13.994801 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-84vwj" event={"ID":"5c00d52a-acc5-4650-8b36-48faa90030a3","Type":"ContainerStarted","Data":"01aba5496c5dea7c5fbffbc71e01eac512642a75e6a5fe04b6b4e4248fef5b62"} Oct 03 14:58:14 crc kubenswrapper[4774]: I1003 14:58:14.019643 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-ftzfn" podStartSLOduration=2.930855275 podStartE2EDuration="20.019625951s" podCreationTimestamp="2025-10-03 14:57:54 +0000 UTC" firstStartedPulling="2025-10-03 14:57:55.837792434 +0000 UTC m=+898.426995886" lastFinishedPulling="2025-10-03 14:58:12.92656311 +0000 UTC m=+915.515766562" observedRunningTime="2025-10-03 14:58:14.014941524 +0000 UTC m=+916.604144976" watchObservedRunningTime="2025-10-03 14:58:14.019625951 +0000 UTC m=+916.608829403" Oct 03 14:58:14 crc kubenswrapper[4774]: I1003 14:58:14.037419 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-b54rk" 
event={"ID":"cea60414-e959-4200-b3e5-e532d2136047","Type":"ContainerStarted","Data":"b2d56627a2de0206c2143ee68f41bd3bb2d4c5dfdf6dc34b6b78484e51e5e68f"} Oct 03 14:58:14 crc kubenswrapper[4774]: I1003 14:58:14.039362 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-zhmr8" event={"ID":"bc3a311f-a6c2-40e4-aaae-549aa2395c57","Type":"ContainerStarted","Data":"517e1c9e915fb3262a312647759c7b29ff8a8e9d475eb9dcd1e9c98fca830760"} Oct 03 14:58:14 crc kubenswrapper[4774]: I1003 14:58:14.052067 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-6gckj" podStartSLOduration=3.210396146 podStartE2EDuration="21.052052329s" podCreationTimestamp="2025-10-03 14:57:53 +0000 UTC" firstStartedPulling="2025-10-03 14:57:55.085861091 +0000 UTC m=+897.675064533" lastFinishedPulling="2025-10-03 14:58:12.927517264 +0000 UTC m=+915.516720716" observedRunningTime="2025-10-03 14:58:14.050127511 +0000 UTC m=+916.639330963" watchObservedRunningTime="2025-10-03 14:58:14.052052329 +0000 UTC m=+916.641255781" Oct 03 14:58:14 crc kubenswrapper[4774]: I1003 14:58:14.077025 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-24bvc" event={"ID":"10674e8a-5afd-45f7-af36-e9dbfaf2dba0","Type":"ContainerStarted","Data":"0ad93a9bef261be8026903d015f10f0368616ebfb0c1b7f01571cce49d0b551c"} Oct 03 14:58:14 crc kubenswrapper[4774]: I1003 14:58:14.110207 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-zpq8n" event={"ID":"fcb0af6a-547c-4555-86f9-f0b390ae7ce3","Type":"ContainerStarted","Data":"c14e6c16cd119028d5a5556b39bb736724219a9ccf9fefa3489e7f2c5f61d68b"} Oct 03 14:58:14 crc kubenswrapper[4774]: I1003 14:58:14.110257 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-zpq8n" event={"ID":"fcb0af6a-547c-4555-86f9-f0b390ae7ce3","Type":"ContainerStarted","Data":"d2fb531bbc5ee0d11520456405ce7a8205735ac6ae1aeba47659a94ae37511d7"} Oct 03 14:58:14 crc kubenswrapper[4774]: I1003 14:58:14.110309 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-zpq8n" Oct 03 14:58:14 crc kubenswrapper[4774]: I1003 14:58:14.122274 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-599898f689-mhz7w" event={"ID":"ecf9de8d-83d4-41cb-9b00-e3aeedfb93fd","Type":"ContainerStarted","Data":"b4d20296fa1813e08e1873c0fdd882f1cee81b967b543dac4667456a774ea3b0"} Oct 03 14:58:14 crc kubenswrapper[4774]: I1003 14:58:14.138982 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-zpq8n" podStartSLOduration=3.042931627 podStartE2EDuration="20.138961084s" podCreationTimestamp="2025-10-03 14:57:54 +0000 UTC" firstStartedPulling="2025-10-03 14:57:55.830685957 +0000 UTC m=+898.419889409" lastFinishedPulling="2025-10-03 14:58:12.926715414 +0000 UTC m=+915.515918866" observedRunningTime="2025-10-03 14:58:14.138693717 +0000 UTC m=+916.727897159" watchObservedRunningTime="2025-10-03 14:58:14.138961084 +0000 UTC m=+916.728164536" Oct 03 14:58:14 crc kubenswrapper[4774]: I1003 14:58:14.152816 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-mtjh7" event={"ID":"6edea7f2-581f-4f41-bdda-45e83dce680d","Type":"ContainerStarted","Data":"8d1daf296f6f762a52c339470cefe3260cdedcc6ca7a6a54a336a226cefe98c7"} Oct 03 14:58:14 crc kubenswrapper[4774]: E1003 14:58:14.163134 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:110b885fe640ffdd8536e7da2a613677a6777e3d902e2ff15fa4d5968fe06c54\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-mtjh7" podUID="6edea7f2-581f-4f41-bdda-45e83dce680d" Oct 03 14:58:14 crc kubenswrapper[4774]: I1003 14:58:14.183673 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-pqrxt" event={"ID":"35aefd42-2274-451a-8526-fb99c1f72be0","Type":"ContainerStarted","Data":"16c796c42475c4e93fbc8dcd87e634e9cc624a517e2c736333fcd48b21b1e54f"} Oct 03 14:58:14 crc kubenswrapper[4774]: I1003 14:58:14.185171 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-xqmql" event={"ID":"77b9687d-958c-47ad-835e-160fc6214d72","Type":"ContainerStarted","Data":"2648b48d4cebb58a5f43cc57e95c02c82c56624fe87aeb628325b4c008c5a9da"} Oct 03 14:58:14 crc kubenswrapper[4774]: I1003 14:58:14.213264 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-xqmql" podStartSLOduration=3.108292695 podStartE2EDuration="20.213244374s" podCreationTimestamp="2025-10-03 14:57:54 +0000 UTC" firstStartedPulling="2025-10-03 14:57:55.821708013 +0000 UTC m=+898.410911465" lastFinishedPulling="2025-10-03 14:58:12.926659682 +0000 UTC m=+915.515863144" observedRunningTime="2025-10-03 14:58:14.209728517 +0000 UTC m=+916.798931979" watchObservedRunningTime="2025-10-03 14:58:14.213244374 +0000 UTC m=+916.802447826" Oct 03 14:58:14 crc kubenswrapper[4774]: I1003 14:58:14.222952 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-2sg6f" event={"ID":"741c17eb-da65-4dce-abc6-7faa47d28004","Type":"ContainerStarted","Data":"e905057161091cddc1d189761dc5a885a83f95c01f21470ebe354404a87687a3"} Oct 03 14:58:14 
crc kubenswrapper[4774]: I1003 14:58:14.259351 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-nw7mh" event={"ID":"eaf48bda-c7ca-484b-8d8f-b195d011e8f9","Type":"ContainerStarted","Data":"7a30d1fa921f28b09db41c83304c11c6a34cff7783d5f0b5daf2e7a68e0a6569"} Oct 03 14:58:14 crc kubenswrapper[4774]: I1003 14:58:14.259423 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-nw7mh" event={"ID":"eaf48bda-c7ca-484b-8d8f-b195d011e8f9","Type":"ContainerStarted","Data":"da481e90c363d5173cd0cbd19df96d68b8ddb7d00208a87d6b9117e9dc293f96"} Oct 03 14:58:14 crc kubenswrapper[4774]: I1003 14:58:14.260541 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-nw7mh" Oct 03 14:58:15 crc kubenswrapper[4774]: I1003 14:58:15.269306 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-599898f689-mhz7w" event={"ID":"ecf9de8d-83d4-41cb-9b00-e3aeedfb93fd","Type":"ContainerStarted","Data":"3acc41199923cc8374dcec3dcf60957636448f4531684068c288ddf8f330f463"} Oct 03 14:58:15 crc kubenswrapper[4774]: I1003 14:58:15.269742 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-599898f689-mhz7w" Oct 03 14:58:15 crc kubenswrapper[4774]: I1003 14:58:15.272778 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-b54rk" event={"ID":"cea60414-e959-4200-b3e5-e532d2136047","Type":"ContainerStarted","Data":"589308602ba5dfd0b4d8453a1cbae5fe7f20ad92218f4179fd33db51b196194a"} Oct 03 14:58:15 crc kubenswrapper[4774]: I1003 14:58:15.272901 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-b54rk" Oct 03 14:58:15 crc kubenswrapper[4774]: I1003 14:58:15.274877 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-pqrxt" event={"ID":"35aefd42-2274-451a-8526-fb99c1f72be0","Type":"ContainerStarted","Data":"7d610656d0838476f4718c96ca54e49c914e63510b0209f0b74dfb221708bb6a"} Oct 03 14:58:15 crc kubenswrapper[4774]: I1003 14:58:15.274955 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-pqrxt" Oct 03 14:58:15 crc kubenswrapper[4774]: I1003 14:58:15.280250 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-v5kkx" event={"ID":"29905a36-139e-4611-bc8e-0289dd1fa0b4","Type":"ContainerStarted","Data":"3d162d9d1d9f91de3cc71ce56dcbd3d2ead477401c3a34ea8ed0b4723a08c1ef"} Oct 03 14:58:15 crc kubenswrapper[4774]: I1003 14:58:15.280827 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-v5kkx" Oct 03 14:58:15 crc kubenswrapper[4774]: I1003 14:58:15.282389 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-2sg6f" event={"ID":"741c17eb-da65-4dce-abc6-7faa47d28004","Type":"ContainerStarted","Data":"33df482c72a7adb760eaced9745f3a195cf954b5eb42db6d1b11a6bf1ef968ba"} Oct 03 14:58:15 crc kubenswrapper[4774]: I1003 14:58:15.282726 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-2sg6f" Oct 03 14:58:15 crc kubenswrapper[4774]: I1003 14:58:15.289198 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-24bvc" 
event={"ID":"10674e8a-5afd-45f7-af36-e9dbfaf2dba0","Type":"ContainerStarted","Data":"c178e5d4d96a897efb83cbc1126b2f3d9011d7485aeb4c42189719c0468a422d"} Oct 03 14:58:15 crc kubenswrapper[4774]: I1003 14:58:15.289778 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-24bvc" Oct 03 14:58:15 crc kubenswrapper[4774]: I1003 14:58:15.295120 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-m2wmk" event={"ID":"4cf018fb-edab-4e23-ad04-763ee25e1613","Type":"ContainerStarted","Data":"dab11a2771a3e1b0035daad799b6b4aaa234744b1b1d40485a9da729f90b11de"} Oct 03 14:58:15 crc kubenswrapper[4774]: I1003 14:58:15.295883 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-m2wmk" Oct 03 14:58:15 crc kubenswrapper[4774]: I1003 14:58:15.297860 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-599898f689-mhz7w" podStartSLOduration=4.81542559 podStartE2EDuration="22.297838583s" podCreationTimestamp="2025-10-03 14:57:53 +0000 UTC" firstStartedPulling="2025-10-03 14:57:55.444829204 +0000 UTC m=+898.034032656" lastFinishedPulling="2025-10-03 14:58:12.927242197 +0000 UTC m=+915.516445649" observedRunningTime="2025-10-03 14:58:15.289996148 +0000 UTC m=+917.879199620" watchObservedRunningTime="2025-10-03 14:58:15.297838583 +0000 UTC m=+917.887042045" Oct 03 14:58:15 crc kubenswrapper[4774]: I1003 14:58:15.310629 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-nw7mh" podStartSLOduration=5.205134288 podStartE2EDuration="22.310612561s" podCreationTimestamp="2025-10-03 14:57:53 +0000 UTC" firstStartedPulling="2025-10-03 14:57:55.821354864 +0000 UTC m=+898.410558316" 
lastFinishedPulling="2025-10-03 14:58:12.926833107 +0000 UTC m=+915.516036589" observedRunningTime="2025-10-03 14:58:14.28532048 +0000 UTC m=+916.874523952" watchObservedRunningTime="2025-10-03 14:58:15.310612561 +0000 UTC m=+917.899816013" Oct 03 14:58:15 crc kubenswrapper[4774]: I1003 14:58:15.312922 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-2sg6f" podStartSLOduration=4.82785349 podStartE2EDuration="22.312914759s" podCreationTimestamp="2025-10-03 14:57:53 +0000 UTC" firstStartedPulling="2025-10-03 14:57:55.44146429 +0000 UTC m=+898.030667742" lastFinishedPulling="2025-10-03 14:58:12.926525559 +0000 UTC m=+915.515729011" observedRunningTime="2025-10-03 14:58:15.304664233 +0000 UTC m=+917.893867695" watchObservedRunningTime="2025-10-03 14:58:15.312914759 +0000 UTC m=+917.902118211" Oct 03 14:58:15 crc kubenswrapper[4774]: I1003 14:58:15.314290 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-84vwj" Oct 03 14:58:15 crc kubenswrapper[4774]: I1003 14:58:15.315739 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-84vwj" event={"ID":"5c00d52a-acc5-4650-8b36-48faa90030a3","Type":"ContainerStarted","Data":"178513118ca1cf8f95b84d481abb85cff676f0426127c6450bc9090f8f372d02"} Oct 03 14:58:15 crc kubenswrapper[4774]: I1003 14:58:15.317968 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-zhmr8" Oct 03 14:58:15 crc kubenswrapper[4774]: I1003 14:58:15.318115 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-zhmr8" 
event={"ID":"bc3a311f-a6c2-40e4-aaae-549aa2395c57","Type":"ContainerStarted","Data":"ff2d09c50f04e64eb4f0bff007a5f4ca7e5238fa51b0803012825807027ae422"} Oct 03 14:58:15 crc kubenswrapper[4774]: E1003 14:58:15.315802 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:110b885fe640ffdd8536e7da2a613677a6777e3d902e2ff15fa4d5968fe06c54\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-mtjh7" podUID="6edea7f2-581f-4f41-bdda-45e83dce680d" Oct 03 14:58:15 crc kubenswrapper[4774]: I1003 14:58:15.322749 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-pqrxt" podStartSLOduration=4.324790719 podStartE2EDuration="21.322732713s" podCreationTimestamp="2025-10-03 14:57:54 +0000 UTC" firstStartedPulling="2025-10-03 14:57:55.92877529 +0000 UTC m=+898.517978732" lastFinishedPulling="2025-10-03 14:58:12.926717264 +0000 UTC m=+915.515920726" observedRunningTime="2025-10-03 14:58:15.316428896 +0000 UTC m=+917.905632348" watchObservedRunningTime="2025-10-03 14:58:15.322732713 +0000 UTC m=+917.911936165" Oct 03 14:58:15 crc kubenswrapper[4774]: I1003 14:58:15.339725 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-b54rk" podStartSLOduration=4.686479038 podStartE2EDuration="22.339706336s" podCreationTimestamp="2025-10-03 14:57:53 +0000 UTC" firstStartedPulling="2025-10-03 14:57:55.274197793 +0000 UTC m=+897.863401235" lastFinishedPulling="2025-10-03 14:58:12.927425081 +0000 UTC m=+915.516628533" observedRunningTime="2025-10-03 14:58:15.330404324 +0000 UTC m=+917.919607786" watchObservedRunningTime="2025-10-03 14:58:15.339706336 +0000 UTC m=+917.928909778" Oct 03 14:58:15 crc kubenswrapper[4774]: I1003 
14:58:15.356007 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-24bvc" podStartSLOduration=4.866690577 podStartE2EDuration="22.355984322s" podCreationTimestamp="2025-10-03 14:57:53 +0000 UTC" firstStartedPulling="2025-10-03 14:57:55.437210604 +0000 UTC m=+898.026414066" lastFinishedPulling="2025-10-03 14:58:12.926504339 +0000 UTC m=+915.515707811" observedRunningTime="2025-10-03 14:58:15.345654764 +0000 UTC m=+917.934858216" watchObservedRunningTime="2025-10-03 14:58:15.355984322 +0000 UTC m=+917.945187774" Oct 03 14:58:15 crc kubenswrapper[4774]: I1003 14:58:15.374095 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-v5kkx" podStartSLOduration=4.530681256 podStartE2EDuration="22.374079802s" podCreationTimestamp="2025-10-03 14:57:53 +0000 UTC" firstStartedPulling="2025-10-03 14:57:55.085480002 +0000 UTC m=+897.674683454" lastFinishedPulling="2025-10-03 14:58:12.928878508 +0000 UTC m=+915.518082000" observedRunningTime="2025-10-03 14:58:15.365128529 +0000 UTC m=+917.954331981" watchObservedRunningTime="2025-10-03 14:58:15.374079802 +0000 UTC m=+917.963283254" Oct 03 14:58:15 crc kubenswrapper[4774]: I1003 14:58:15.405045 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-m2wmk" podStartSLOduration=4.793464232 podStartE2EDuration="22.405029053s" podCreationTimestamp="2025-10-03 14:57:53 +0000 UTC" firstStartedPulling="2025-10-03 14:57:55.315143553 +0000 UTC m=+897.904347005" lastFinishedPulling="2025-10-03 14:58:12.926708344 +0000 UTC m=+915.515911826" observedRunningTime="2025-10-03 14:58:15.402738336 +0000 UTC m=+917.991941798" watchObservedRunningTime="2025-10-03 14:58:15.405029053 +0000 UTC m=+917.994232505" Oct 03 14:58:15 crc kubenswrapper[4774]: I1003 14:58:15.427397 4774 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-84vwj" podStartSLOduration=4.920063937 podStartE2EDuration="22.427346139s" podCreationTimestamp="2025-10-03 14:57:53 +0000 UTC" firstStartedPulling="2025-10-03 14:57:55.418476548 +0000 UTC m=+898.007680000" lastFinishedPulling="2025-10-03 14:58:12.92575873 +0000 UTC m=+915.514962202" observedRunningTime="2025-10-03 14:58:15.420923149 +0000 UTC m=+918.010126601" watchObservedRunningTime="2025-10-03 14:58:15.427346139 +0000 UTC m=+918.016549591" Oct 03 14:58:15 crc kubenswrapper[4774]: I1003 14:58:15.434714 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-zhmr8" podStartSLOduration=5.788696407 podStartE2EDuration="22.434698563s" podCreationTimestamp="2025-10-03 14:57:53 +0000 UTC" firstStartedPulling="2025-10-03 14:57:56.280970634 +0000 UTC m=+898.870174086" lastFinishedPulling="2025-10-03 14:58:12.92697278 +0000 UTC m=+915.516176242" observedRunningTime="2025-10-03 14:58:15.433358089 +0000 UTC m=+918.022561561" watchObservedRunningTime="2025-10-03 14:58:15.434698563 +0000 UTC m=+918.023902015" Oct 03 14:58:19 crc kubenswrapper[4774]: I1003 14:58:19.338479 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-j5d2d" event={"ID":"e9d1e188-b3ff-4807-a57e-9bf290e10f22","Type":"ContainerStarted","Data":"864aea22e44837e56daa0f6bcca7cbca3158941f743ffc4816e722599d62d84c"} Oct 03 14:58:19 crc kubenswrapper[4774]: I1003 14:58:19.339018 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-j5d2d" Oct 03 14:58:19 crc kubenswrapper[4774]: I1003 14:58:19.339889 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-2tqv8" 
event={"ID":"02bdd1b6-4d8f-40ce-b0fe-449c738d5d0e","Type":"ContainerStarted","Data":"e6eaf1a97cf22b731ffcf457ff3fa87ba62829f0f9dbf016a9816bd5fbe29138"} Oct 03 14:58:19 crc kubenswrapper[4774]: I1003 14:58:19.340466 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-2tqv8" Oct 03 14:58:19 crc kubenswrapper[4774]: I1003 14:58:19.342045 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678chmhg" event={"ID":"2b82b2ba-da6d-4441-a194-4b47207b159a","Type":"ContainerStarted","Data":"13c27e3bcec7266080b4eb695ada041a3f14e47fe6d2536af765c9c71b2d3054"} Oct 03 14:58:19 crc kubenswrapper[4774]: I1003 14:58:19.342558 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678chmhg" Oct 03 14:58:19 crc kubenswrapper[4774]: I1003 14:58:19.344196 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-fhd72" event={"ID":"74855628-79e3-4300-8a8b-d05aeed1904b","Type":"ContainerStarted","Data":"4fd224e7b86e9524e4774b3db334815d3478aa4c1066f6bbd8e452f451202e70"} Oct 03 14:58:19 crc kubenswrapper[4774]: I1003 14:58:19.344541 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-fhd72" Oct 03 14:58:19 crc kubenswrapper[4774]: I1003 14:58:19.346072 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-fw4zb" event={"ID":"b32ca090-1129-4c77-a2b3-df9e51a35a48","Type":"ContainerStarted","Data":"08c28d2661ef786545d32fed8952fe581281edc6e37796e9a2f0e2478ab8d0ce"} Oct 03 14:58:19 crc kubenswrapper[4774]: I1003 14:58:19.346290 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-fw4zb" Oct 03 14:58:19 crc kubenswrapper[4774]: I1003 14:58:19.348775 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-ngbtk" event={"ID":"b8324f27-b72f-4ad9-adcb-82469098520a","Type":"ContainerStarted","Data":"292e93c83cd65b7cf6d42fa93416e7cdc715e3536f3c20196ba514ecd53d27cd"} Oct 03 14:58:19 crc kubenswrapper[4774]: I1003 14:58:19.349125 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-ngbtk" Oct 03 14:58:19 crc kubenswrapper[4774]: I1003 14:58:19.350520 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-c7hxr" event={"ID":"14fc26c3-ab56-44a5-832c-55eaca43cc5c","Type":"ContainerStarted","Data":"3e78545e985657af3a5b07753d7b747b32d97e12cf89542242fe7b0b72870efc"} Oct 03 14:58:19 crc kubenswrapper[4774]: I1003 14:58:19.350674 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-c7hxr" Oct 03 14:58:19 crc kubenswrapper[4774]: I1003 14:58:19.369468 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-j5d2d" podStartSLOduration=2.815121429 podStartE2EDuration="25.369450735s" podCreationTimestamp="2025-10-03 14:57:54 +0000 UTC" firstStartedPulling="2025-10-03 14:57:55.929139199 +0000 UTC m=+898.518342651" lastFinishedPulling="2025-10-03 14:58:18.483468515 +0000 UTC m=+921.072671957" observedRunningTime="2025-10-03 14:58:19.361471427 +0000 UTC m=+921.950674889" watchObservedRunningTime="2025-10-03 14:58:19.369450735 +0000 UTC m=+921.958654187" Oct 03 14:58:19 crc kubenswrapper[4774]: I1003 14:58:19.403926 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-ngbtk" podStartSLOduration=2.83916373 podStartE2EDuration="25.403911654s" podCreationTimestamp="2025-10-03 14:57:54 +0000 UTC" firstStartedPulling="2025-10-03 14:57:55.929303334 +0000 UTC m=+898.518506786" lastFinishedPulling="2025-10-03 14:58:18.494051228 +0000 UTC m=+921.083254710" observedRunningTime="2025-10-03 14:58:19.388247434 +0000 UTC m=+921.977450876" watchObservedRunningTime="2025-10-03 14:58:19.403911654 +0000 UTC m=+921.993115106" Oct 03 14:58:19 crc kubenswrapper[4774]: I1003 14:58:19.407454 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-fhd72" podStartSLOduration=3.863350342 podStartE2EDuration="26.407446932s" podCreationTimestamp="2025-10-03 14:57:53 +0000 UTC" firstStartedPulling="2025-10-03 14:57:55.936815311 +0000 UTC m=+898.526018763" lastFinishedPulling="2025-10-03 14:58:18.480911901 +0000 UTC m=+921.070115353" observedRunningTime="2025-10-03 14:58:19.401449403 +0000 UTC m=+921.990652845" watchObservedRunningTime="2025-10-03 14:58:19.407446932 +0000 UTC m=+921.996650384" Oct 03 14:58:19 crc kubenswrapper[4774]: I1003 14:58:19.424240 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678chmhg" podStartSLOduration=3.867065064 podStartE2EDuration="26.42422266s" podCreationTimestamp="2025-10-03 14:57:53 +0000 UTC" firstStartedPulling="2025-10-03 14:57:55.958242054 +0000 UTC m=+898.547445506" lastFinishedPulling="2025-10-03 14:58:18.51539965 +0000 UTC m=+921.104603102" observedRunningTime="2025-10-03 14:58:19.420178379 +0000 UTC m=+922.009381831" watchObservedRunningTime="2025-10-03 14:58:19.42422266 +0000 UTC m=+922.013426112" Oct 03 14:58:19 crc kubenswrapper[4774]: I1003 14:58:19.436525 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-2tqv8" podStartSLOduration=3.87453854 podStartE2EDuration="26.436509296s" podCreationTimestamp="2025-10-03 14:57:53 +0000 UTC" firstStartedPulling="2025-10-03 14:57:55.9355989 +0000 UTC m=+898.524802342" lastFinishedPulling="2025-10-03 14:58:18.497569646 +0000 UTC m=+921.086773098" observedRunningTime="2025-10-03 14:58:19.434453865 +0000 UTC m=+922.023657327" watchObservedRunningTime="2025-10-03 14:58:19.436509296 +0000 UTC m=+922.025712748" Oct 03 14:58:19 crc kubenswrapper[4774]: I1003 14:58:19.450138 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-c7hxr" podStartSLOduration=3.898444296 podStartE2EDuration="26.450116005s" podCreationTimestamp="2025-10-03 14:57:53 +0000 UTC" firstStartedPulling="2025-10-03 14:57:55.929266533 +0000 UTC m=+898.518469985" lastFinishedPulling="2025-10-03 14:58:18.480938212 +0000 UTC m=+921.070141694" observedRunningTime="2025-10-03 14:58:19.450030793 +0000 UTC m=+922.039234245" watchObservedRunningTime="2025-10-03 14:58:19.450116005 +0000 UTC m=+922.039319457" Oct 03 14:58:19 crc kubenswrapper[4774]: I1003 14:58:19.470829 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-fw4zb" podStartSLOduration=3.921826299 podStartE2EDuration="26.470809641s" podCreationTimestamp="2025-10-03 14:57:53 +0000 UTC" firstStartedPulling="2025-10-03 14:57:55.932955045 +0000 UTC m=+898.522158497" lastFinishedPulling="2025-10-03 14:58:18.481938387 +0000 UTC m=+921.071141839" observedRunningTime="2025-10-03 14:58:19.470148264 +0000 UTC m=+922.059351716" watchObservedRunningTime="2025-10-03 14:58:19.470809641 +0000 UTC m=+922.060013093" Oct 03 14:58:23 crc kubenswrapper[4774]: I1003 14:58:23.957435 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-b54rk" Oct 03 14:58:23 crc kubenswrapper[4774]: I1003 14:58:23.974083 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-6gckj" Oct 03 14:58:23 crc kubenswrapper[4774]: I1003 14:58:23.995906 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-m2wmk" Oct 03 14:58:24 crc kubenswrapper[4774]: I1003 14:58:24.022927 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-v5kkx" Oct 03 14:58:24 crc kubenswrapper[4774]: I1003 14:58:24.132203 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-24bvc" Oct 03 14:58:24 crc kubenswrapper[4774]: I1003 14:58:24.132841 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-2sg6f" Oct 03 14:58:24 crc kubenswrapper[4774]: I1003 14:58:24.273081 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-fw4zb" Oct 03 14:58:24 crc kubenswrapper[4774]: I1003 14:58:24.319417 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-599898f689-mhz7w" Oct 03 14:58:24 crc kubenswrapper[4774]: I1003 14:58:24.351533 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-2tqv8" Oct 03 14:58:24 crc kubenswrapper[4774]: I1003 14:58:24.466355 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-84vwj" Oct 03 14:58:24 crc kubenswrapper[4774]: I1003 14:58:24.493994 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-nw7mh" Oct 03 14:58:24 crc kubenswrapper[4774]: I1003 14:58:24.538836 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-fhd72" Oct 03 14:58:24 crc kubenswrapper[4774]: I1003 14:58:24.575662 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-pqrxt" Oct 03 14:58:24 crc kubenswrapper[4774]: I1003 14:58:24.578158 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-c7hxr" Oct 03 14:58:24 crc kubenswrapper[4774]: I1003 14:58:24.587354 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-j5d2d" Oct 03 14:58:24 crc kubenswrapper[4774]: I1003 14:58:24.712198 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-ngbtk" Oct 03 14:58:24 crc kubenswrapper[4774]: I1003 14:58:24.903462 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-zpq8n" Oct 03 14:58:24 crc kubenswrapper[4774]: I1003 14:58:24.989115 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-ftzfn" Oct 03 14:58:25 crc kubenswrapper[4774]: I1003 14:58:25.138792 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678chmhg" Oct 03 14:58:25 crc kubenswrapper[4774]: I1003 14:58:25.576559 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-zhmr8" Oct 03 14:58:29 crc kubenswrapper[4774]: I1003 14:58:29.432295 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-mtjh7" event={"ID":"6edea7f2-581f-4f41-bdda-45e83dce680d","Type":"ContainerStarted","Data":"0d28edc2b4fac86ab41f3a17c7cde84fdc8abf564a186711af7ad078a4c463e2"} Oct 03 14:58:29 crc kubenswrapper[4774]: I1003 14:58:29.433263 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-mtjh7" Oct 03 14:58:29 crc kubenswrapper[4774]: I1003 14:58:29.460199 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-mtjh7" podStartSLOduration=3.071512026 podStartE2EDuration="36.460171916s" podCreationTimestamp="2025-10-03 14:57:53 +0000 UTC" firstStartedPulling="2025-10-03 14:57:55.403107605 +0000 UTC m=+897.992311057" lastFinishedPulling="2025-10-03 14:58:28.791767455 +0000 UTC m=+931.380970947" observedRunningTime="2025-10-03 14:58:29.452876304 +0000 UTC m=+932.042079796" watchObservedRunningTime="2025-10-03 14:58:29.460171916 +0000 UTC m=+932.049375418" Oct 03 14:58:34 crc kubenswrapper[4774]: I1003 14:58:34.139409 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-mtjh7" Oct 03 14:58:48 crc kubenswrapper[4774]: I1003 14:58:48.974556 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-tx545"] Oct 03 14:58:48 crc kubenswrapper[4774]: I1003 14:58:48.976125 4774 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-tx545" Oct 03 14:58:48 crc kubenswrapper[4774]: I1003 14:58:48.977840 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 03 14:58:48 crc kubenswrapper[4774]: I1003 14:58:48.978179 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-vpkqx" Oct 03 14:58:48 crc kubenswrapper[4774]: I1003 14:58:48.978261 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 03 14:58:48 crc kubenswrapper[4774]: I1003 14:58:48.978317 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 03 14:58:48 crc kubenswrapper[4774]: I1003 14:58:48.989612 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-tx545"] Oct 03 14:58:49 crc kubenswrapper[4774]: I1003 14:58:49.067425 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3d2fe24-39c4-492e-a220-d46d06b1b587-config\") pod \"dnsmasq-dns-675f4bcbfc-tx545\" (UID: \"f3d2fe24-39c4-492e-a220-d46d06b1b587\") " pod="openstack/dnsmasq-dns-675f4bcbfc-tx545" Oct 03 14:58:49 crc kubenswrapper[4774]: I1003 14:58:49.067482 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5sqf\" (UniqueName: \"kubernetes.io/projected/f3d2fe24-39c4-492e-a220-d46d06b1b587-kube-api-access-q5sqf\") pod \"dnsmasq-dns-675f4bcbfc-tx545\" (UID: \"f3d2fe24-39c4-492e-a220-d46d06b1b587\") " pod="openstack/dnsmasq-dns-675f4bcbfc-tx545" Oct 03 14:58:49 crc kubenswrapper[4774]: I1003 14:58:49.081329 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-xfhvv"] Oct 03 14:58:49 crc kubenswrapper[4774]: I1003 14:58:49.082472 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-xfhvv" Oct 03 14:58:49 crc kubenswrapper[4774]: I1003 14:58:49.085695 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 03 14:58:49 crc kubenswrapper[4774]: I1003 14:58:49.101056 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-xfhvv"] Oct 03 14:58:49 crc kubenswrapper[4774]: I1003 14:58:49.168761 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59r49\" (UniqueName: \"kubernetes.io/projected/fd4951b7-0996-49e5-9e11-b096eea903b5-kube-api-access-59r49\") pod \"dnsmasq-dns-78dd6ddcc-xfhvv\" (UID: \"fd4951b7-0996-49e5-9e11-b096eea903b5\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xfhvv" Oct 03 14:58:49 crc kubenswrapper[4774]: I1003 14:58:49.168796 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd4951b7-0996-49e5-9e11-b096eea903b5-config\") pod \"dnsmasq-dns-78dd6ddcc-xfhvv\" (UID: \"fd4951b7-0996-49e5-9e11-b096eea903b5\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xfhvv" Oct 03 14:58:49 crc kubenswrapper[4774]: I1003 14:58:49.168925 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3d2fe24-39c4-492e-a220-d46d06b1b587-config\") pod \"dnsmasq-dns-675f4bcbfc-tx545\" (UID: \"f3d2fe24-39c4-492e-a220-d46d06b1b587\") " pod="openstack/dnsmasq-dns-675f4bcbfc-tx545" Oct 03 14:58:49 crc kubenswrapper[4774]: I1003 14:58:49.168975 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5sqf\" (UniqueName: \"kubernetes.io/projected/f3d2fe24-39c4-492e-a220-d46d06b1b587-kube-api-access-q5sqf\") pod \"dnsmasq-dns-675f4bcbfc-tx545\" (UID: \"f3d2fe24-39c4-492e-a220-d46d06b1b587\") " pod="openstack/dnsmasq-dns-675f4bcbfc-tx545" Oct 03 14:58:49 crc 
kubenswrapper[4774]: I1003 14:58:49.169000 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd4951b7-0996-49e5-9e11-b096eea903b5-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-xfhvv\" (UID: \"fd4951b7-0996-49e5-9e11-b096eea903b5\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xfhvv" Oct 03 14:58:49 crc kubenswrapper[4774]: I1003 14:58:49.170098 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3d2fe24-39c4-492e-a220-d46d06b1b587-config\") pod \"dnsmasq-dns-675f4bcbfc-tx545\" (UID: \"f3d2fe24-39c4-492e-a220-d46d06b1b587\") " pod="openstack/dnsmasq-dns-675f4bcbfc-tx545" Oct 03 14:58:49 crc kubenswrapper[4774]: I1003 14:58:49.191318 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5sqf\" (UniqueName: \"kubernetes.io/projected/f3d2fe24-39c4-492e-a220-d46d06b1b587-kube-api-access-q5sqf\") pod \"dnsmasq-dns-675f4bcbfc-tx545\" (UID: \"f3d2fe24-39c4-492e-a220-d46d06b1b587\") " pod="openstack/dnsmasq-dns-675f4bcbfc-tx545" Oct 03 14:58:49 crc kubenswrapper[4774]: I1003 14:58:49.270575 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd4951b7-0996-49e5-9e11-b096eea903b5-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-xfhvv\" (UID: \"fd4951b7-0996-49e5-9e11-b096eea903b5\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xfhvv" Oct 03 14:58:49 crc kubenswrapper[4774]: I1003 14:58:49.270648 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59r49\" (UniqueName: \"kubernetes.io/projected/fd4951b7-0996-49e5-9e11-b096eea903b5-kube-api-access-59r49\") pod \"dnsmasq-dns-78dd6ddcc-xfhvv\" (UID: \"fd4951b7-0996-49e5-9e11-b096eea903b5\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xfhvv" Oct 03 14:58:49 crc kubenswrapper[4774]: I1003 14:58:49.270675 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd4951b7-0996-49e5-9e11-b096eea903b5-config\") pod \"dnsmasq-dns-78dd6ddcc-xfhvv\" (UID: \"fd4951b7-0996-49e5-9e11-b096eea903b5\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xfhvv" Oct 03 14:58:49 crc kubenswrapper[4774]: I1003 14:58:49.271664 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd4951b7-0996-49e5-9e11-b096eea903b5-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-xfhvv\" (UID: \"fd4951b7-0996-49e5-9e11-b096eea903b5\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xfhvv" Oct 03 14:58:49 crc kubenswrapper[4774]: I1003 14:58:49.271690 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd4951b7-0996-49e5-9e11-b096eea903b5-config\") pod \"dnsmasq-dns-78dd6ddcc-xfhvv\" (UID: \"fd4951b7-0996-49e5-9e11-b096eea903b5\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xfhvv" Oct 03 14:58:49 crc kubenswrapper[4774]: I1003 14:58:49.288512 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59r49\" (UniqueName: \"kubernetes.io/projected/fd4951b7-0996-49e5-9e11-b096eea903b5-kube-api-access-59r49\") pod \"dnsmasq-dns-78dd6ddcc-xfhvv\" (UID: \"fd4951b7-0996-49e5-9e11-b096eea903b5\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xfhvv" Oct 03 14:58:49 crc kubenswrapper[4774]: I1003 14:58:49.299385 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-tx545" Oct 03 14:58:49 crc kubenswrapper[4774]: I1003 14:58:49.397897 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-xfhvv" Oct 03 14:58:49 crc kubenswrapper[4774]: I1003 14:58:49.743200 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-tx545"] Oct 03 14:58:49 crc kubenswrapper[4774]: I1003 14:58:49.747681 4774 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 14:58:49 crc kubenswrapper[4774]: I1003 14:58:49.834345 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-xfhvv"] Oct 03 14:58:49 crc kubenswrapper[4774]: W1003 14:58:49.845897 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd4951b7_0996_49e5_9e11_b096eea903b5.slice/crio-5fe04dce08204570773d31601b1ed982973e155bb34b08f630d8c8befb5d55fc WatchSource:0}: Error finding container 5fe04dce08204570773d31601b1ed982973e155bb34b08f630d8c8befb5d55fc: Status 404 returned error can't find the container with id 5fe04dce08204570773d31601b1ed982973e155bb34b08f630d8c8befb5d55fc Oct 03 14:58:50 crc kubenswrapper[4774]: I1003 14:58:50.623341 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-tx545" event={"ID":"f3d2fe24-39c4-492e-a220-d46d06b1b587","Type":"ContainerStarted","Data":"b80de6ed46de55a7e237a3e75653f3e4b8a6910f5186b4209084e8a32c99cc13"} Oct 03 14:58:50 crc kubenswrapper[4774]: I1003 14:58:50.625068 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-xfhvv" event={"ID":"fd4951b7-0996-49e5-9e11-b096eea903b5","Type":"ContainerStarted","Data":"5fe04dce08204570773d31601b1ed982973e155bb34b08f630d8c8befb5d55fc"} Oct 03 14:58:50 crc kubenswrapper[4774]: I1003 14:58:50.653228 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:58:50 crc kubenswrapper[4774]: I1003 14:58:50.653283 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:58:52 crc kubenswrapper[4774]: I1003 14:58:52.159214 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-tx545"] Oct 03 14:58:52 crc kubenswrapper[4774]: I1003 14:58:52.185555 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-cmtxq"] Oct 03 14:58:52 crc kubenswrapper[4774]: I1003 14:58:52.186897 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-cmtxq" Oct 03 14:58:52 crc kubenswrapper[4774]: I1003 14:58:52.202474 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-cmtxq"] Oct 03 14:58:52 crc kubenswrapper[4774]: I1003 14:58:52.315356 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b91a9c71-b83c-4e14-957f-820cba700771-config\") pod \"dnsmasq-dns-5ccc8479f9-cmtxq\" (UID: \"b91a9c71-b83c-4e14-957f-820cba700771\") " pod="openstack/dnsmasq-dns-5ccc8479f9-cmtxq" Oct 03 14:58:52 crc kubenswrapper[4774]: I1003 14:58:52.315456 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5446\" (UniqueName: \"kubernetes.io/projected/b91a9c71-b83c-4e14-957f-820cba700771-kube-api-access-r5446\") pod \"dnsmasq-dns-5ccc8479f9-cmtxq\" (UID: \"b91a9c71-b83c-4e14-957f-820cba700771\") " pod="openstack/dnsmasq-dns-5ccc8479f9-cmtxq" Oct 03 14:58:52 
crc kubenswrapper[4774]: I1003 14:58:52.315507 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b91a9c71-b83c-4e14-957f-820cba700771-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-cmtxq\" (UID: \"b91a9c71-b83c-4e14-957f-820cba700771\") " pod="openstack/dnsmasq-dns-5ccc8479f9-cmtxq" Oct 03 14:58:52 crc kubenswrapper[4774]: I1003 14:58:52.419252 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b91a9c71-b83c-4e14-957f-820cba700771-config\") pod \"dnsmasq-dns-5ccc8479f9-cmtxq\" (UID: \"b91a9c71-b83c-4e14-957f-820cba700771\") " pod="openstack/dnsmasq-dns-5ccc8479f9-cmtxq" Oct 03 14:58:52 crc kubenswrapper[4774]: I1003 14:58:52.419347 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5446\" (UniqueName: \"kubernetes.io/projected/b91a9c71-b83c-4e14-957f-820cba700771-kube-api-access-r5446\") pod \"dnsmasq-dns-5ccc8479f9-cmtxq\" (UID: \"b91a9c71-b83c-4e14-957f-820cba700771\") " pod="openstack/dnsmasq-dns-5ccc8479f9-cmtxq" Oct 03 14:58:52 crc kubenswrapper[4774]: I1003 14:58:52.419453 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b91a9c71-b83c-4e14-957f-820cba700771-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-cmtxq\" (UID: \"b91a9c71-b83c-4e14-957f-820cba700771\") " pod="openstack/dnsmasq-dns-5ccc8479f9-cmtxq" Oct 03 14:58:52 crc kubenswrapper[4774]: I1003 14:58:52.420269 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b91a9c71-b83c-4e14-957f-820cba700771-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-cmtxq\" (UID: \"b91a9c71-b83c-4e14-957f-820cba700771\") " pod="openstack/dnsmasq-dns-5ccc8479f9-cmtxq" Oct 03 14:58:52 crc kubenswrapper[4774]: I1003 14:58:52.420648 4774 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b91a9c71-b83c-4e14-957f-820cba700771-config\") pod \"dnsmasq-dns-5ccc8479f9-cmtxq\" (UID: \"b91a9c71-b83c-4e14-957f-820cba700771\") " pod="openstack/dnsmasq-dns-5ccc8479f9-cmtxq" Oct 03 14:58:52 crc kubenswrapper[4774]: I1003 14:58:52.451173 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5446\" (UniqueName: \"kubernetes.io/projected/b91a9c71-b83c-4e14-957f-820cba700771-kube-api-access-r5446\") pod \"dnsmasq-dns-5ccc8479f9-cmtxq\" (UID: \"b91a9c71-b83c-4e14-957f-820cba700771\") " pod="openstack/dnsmasq-dns-5ccc8479f9-cmtxq" Oct 03 14:58:52 crc kubenswrapper[4774]: I1003 14:58:52.490435 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-xfhvv"] Oct 03 14:58:52 crc kubenswrapper[4774]: I1003 14:58:52.509333 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-cmtxq" Oct 03 14:58:52 crc kubenswrapper[4774]: I1003 14:58:52.525716 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-h8wlr"] Oct 03 14:58:52 crc kubenswrapper[4774]: I1003 14:58:52.533109 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-h8wlr" Oct 03 14:58:52 crc kubenswrapper[4774]: I1003 14:58:52.542794 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-h8wlr"] Oct 03 14:58:52 crc kubenswrapper[4774]: I1003 14:58:52.622241 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea92e131-6368-496f-b7ef-3affd118bb5a-config\") pod \"dnsmasq-dns-57d769cc4f-h8wlr\" (UID: \"ea92e131-6368-496f-b7ef-3affd118bb5a\") " pod="openstack/dnsmasq-dns-57d769cc4f-h8wlr" Oct 03 14:58:52 crc kubenswrapper[4774]: I1003 14:58:52.622292 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea92e131-6368-496f-b7ef-3affd118bb5a-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-h8wlr\" (UID: \"ea92e131-6368-496f-b7ef-3affd118bb5a\") " pod="openstack/dnsmasq-dns-57d769cc4f-h8wlr" Oct 03 14:58:52 crc kubenswrapper[4774]: I1003 14:58:52.622385 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgd8k\" (UniqueName: \"kubernetes.io/projected/ea92e131-6368-496f-b7ef-3affd118bb5a-kube-api-access-xgd8k\") pod \"dnsmasq-dns-57d769cc4f-h8wlr\" (UID: \"ea92e131-6368-496f-b7ef-3affd118bb5a\") " pod="openstack/dnsmasq-dns-57d769cc4f-h8wlr" Oct 03 14:58:52 crc kubenswrapper[4774]: I1003 14:58:52.723206 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgd8k\" (UniqueName: \"kubernetes.io/projected/ea92e131-6368-496f-b7ef-3affd118bb5a-kube-api-access-xgd8k\") pod \"dnsmasq-dns-57d769cc4f-h8wlr\" (UID: \"ea92e131-6368-496f-b7ef-3affd118bb5a\") " pod="openstack/dnsmasq-dns-57d769cc4f-h8wlr" Oct 03 14:58:52 crc kubenswrapper[4774]: I1003 14:58:52.723269 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/ea92e131-6368-496f-b7ef-3affd118bb5a-config\") pod \"dnsmasq-dns-57d769cc4f-h8wlr\" (UID: \"ea92e131-6368-496f-b7ef-3affd118bb5a\") " pod="openstack/dnsmasq-dns-57d769cc4f-h8wlr" Oct 03 14:58:52 crc kubenswrapper[4774]: I1003 14:58:52.723296 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea92e131-6368-496f-b7ef-3affd118bb5a-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-h8wlr\" (UID: \"ea92e131-6368-496f-b7ef-3affd118bb5a\") " pod="openstack/dnsmasq-dns-57d769cc4f-h8wlr" Oct 03 14:58:52 crc kubenswrapper[4774]: I1003 14:58:52.724194 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea92e131-6368-496f-b7ef-3affd118bb5a-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-h8wlr\" (UID: \"ea92e131-6368-496f-b7ef-3affd118bb5a\") " pod="openstack/dnsmasq-dns-57d769cc4f-h8wlr" Oct 03 14:58:52 crc kubenswrapper[4774]: I1003 14:58:52.724273 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea92e131-6368-496f-b7ef-3affd118bb5a-config\") pod \"dnsmasq-dns-57d769cc4f-h8wlr\" (UID: \"ea92e131-6368-496f-b7ef-3affd118bb5a\") " pod="openstack/dnsmasq-dns-57d769cc4f-h8wlr" Oct 03 14:58:52 crc kubenswrapper[4774]: I1003 14:58:52.753121 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgd8k\" (UniqueName: \"kubernetes.io/projected/ea92e131-6368-496f-b7ef-3affd118bb5a-kube-api-access-xgd8k\") pod \"dnsmasq-dns-57d769cc4f-h8wlr\" (UID: \"ea92e131-6368-496f-b7ef-3affd118bb5a\") " pod="openstack/dnsmasq-dns-57d769cc4f-h8wlr" Oct 03 14:58:52 crc kubenswrapper[4774]: I1003 14:58:52.905836 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-h8wlr" Oct 03 14:58:52 crc kubenswrapper[4774]: I1003 14:58:52.995502 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-cmtxq"] Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.407808 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-h8wlr"] Oct 03 14:58:53 crc kubenswrapper[4774]: W1003 14:58:53.411021 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea92e131_6368_496f_b7ef_3affd118bb5a.slice/crio-a368f40bda172bcf0d79cf416ea332c24cfec7a4089cf2c810a08eafa1925d0d WatchSource:0}: Error finding container a368f40bda172bcf0d79cf416ea332c24cfec7a4089cf2c810a08eafa1925d0d: Status 404 returned error can't find the container with id a368f40bda172bcf0d79cf416ea332c24cfec7a4089cf2c810a08eafa1925d0d Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.458016 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.459581 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.465636 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.467145 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.467190 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.467442 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.467757 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-2nlxm" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.469445 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.469603 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.473867 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.594169 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0a0a516a-bd97-4484-802b-71eb14f3ca3f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.594239 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-vw276\" (UniqueName: \"kubernetes.io/projected/0a0a516a-bd97-4484-802b-71eb14f3ca3f-kube-api-access-vw276\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.594267 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.594305 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0a0a516a-bd97-4484-802b-71eb14f3ca3f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.594482 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0a0a516a-bd97-4484-802b-71eb14f3ca3f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.594555 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0a0a516a-bd97-4484-802b-71eb14f3ca3f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.594607 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0a0a516a-bd97-4484-802b-71eb14f3ca3f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.594640 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0a0a516a-bd97-4484-802b-71eb14f3ca3f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.594666 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0a0a516a-bd97-4484-802b-71eb14f3ca3f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.594759 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0a0a516a-bd97-4484-802b-71eb14f3ca3f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.594801 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0a0a516a-bd97-4484-802b-71eb14f3ca3f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.638508 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 14:58:53 crc 
kubenswrapper[4774]: I1003 14:58:53.640167 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.643235 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.643552 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.644118 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.645196 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-m6dkz" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.645506 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.647481 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.649836 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.656416 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-h8wlr" event={"ID":"ea92e131-6368-496f-b7ef-3affd118bb5a","Type":"ContainerStarted","Data":"a368f40bda172bcf0d79cf416ea332c24cfec7a4089cf2c810a08eafa1925d0d"} Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.661509 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-cmtxq" event={"ID":"b91a9c71-b83c-4e14-957f-820cba700771","Type":"ContainerStarted","Data":"d58d6367d94351bb56b0ff2e431904b9b441d7c7c4a46b9226a08a7d638883ae"} Oct 03 14:58:53 crc kubenswrapper[4774]: 
I1003 14:58:53.668184 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.695753 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0a0a516a-bd97-4484-802b-71eb14f3ca3f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.695997 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0a0a516a-bd97-4484-802b-71eb14f3ca3f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.696029 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0a0a516a-bd97-4484-802b-71eb14f3ca3f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.696057 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0a0a516a-bd97-4484-802b-71eb14f3ca3f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.696083 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0a0a516a-bd97-4484-802b-71eb14f3ca3f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.696131 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0a0a516a-bd97-4484-802b-71eb14f3ca3f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.696279 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0a0a516a-bd97-4484-802b-71eb14f3ca3f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.696500 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0a0a516a-bd97-4484-802b-71eb14f3ca3f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.696640 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0a0a516a-bd97-4484-802b-71eb14f3ca3f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.696958 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0a0a516a-bd97-4484-802b-71eb14f3ca3f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 
14:58:53.697099 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0a0a516a-bd97-4484-802b-71eb14f3ca3f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.697563 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0a0a516a-bd97-4484-802b-71eb14f3ca3f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.697681 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0a0a516a-bd97-4484-802b-71eb14f3ca3f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.697986 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0a0a516a-bd97-4484-802b-71eb14f3ca3f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.698262 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw276\" (UniqueName: \"kubernetes.io/projected/0a0a516a-bd97-4484-802b-71eb14f3ca3f-kube-api-access-vw276\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.698294 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.698735 4774 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.702619 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0a0a516a-bd97-4484-802b-71eb14f3ca3f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.703188 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0a0a516a-bd97-4484-802b-71eb14f3ca3f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.704693 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0a0a516a-bd97-4484-802b-71eb14f3ca3f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.713610 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0a0a516a-bd97-4484-802b-71eb14f3ca3f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.717600 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw276\" (UniqueName: \"kubernetes.io/projected/0a0a516a-bd97-4484-802b-71eb14f3ca3f-kube-api-access-vw276\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.725868 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.794749 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.800266 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7fa97e79-a30c-4722-b02b-ec5494bd057c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " pod="openstack/rabbitmq-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.800329 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " pod="openstack/rabbitmq-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.800350 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/7fa97e79-a30c-4722-b02b-ec5494bd057c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " pod="openstack/rabbitmq-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.800447 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7fa97e79-a30c-4722-b02b-ec5494bd057c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " pod="openstack/rabbitmq-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.800473 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7fa97e79-a30c-4722-b02b-ec5494bd057c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " pod="openstack/rabbitmq-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.800615 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7fa97e79-a30c-4722-b02b-ec5494bd057c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " pod="openstack/rabbitmq-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.801733 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5r22\" (UniqueName: \"kubernetes.io/projected/7fa97e79-a30c-4722-b02b-ec5494bd057c-kube-api-access-b5r22\") pod \"rabbitmq-server-0\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " pod="openstack/rabbitmq-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.801794 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/7fa97e79-a30c-4722-b02b-ec5494bd057c-config-data\") pod \"rabbitmq-server-0\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " pod="openstack/rabbitmq-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.801887 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7fa97e79-a30c-4722-b02b-ec5494bd057c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " pod="openstack/rabbitmq-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.801921 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7fa97e79-a30c-4722-b02b-ec5494bd057c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " pod="openstack/rabbitmq-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.801985 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7fa97e79-a30c-4722-b02b-ec5494bd057c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " pod="openstack/rabbitmq-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.903502 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7fa97e79-a30c-4722-b02b-ec5494bd057c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " pod="openstack/rabbitmq-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.903545 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7fa97e79-a30c-4722-b02b-ec5494bd057c-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-server-0\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " pod="openstack/rabbitmq-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.903561 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7fa97e79-a30c-4722-b02b-ec5494bd057c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " pod="openstack/rabbitmq-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.903578 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5r22\" (UniqueName: \"kubernetes.io/projected/7fa97e79-a30c-4722-b02b-ec5494bd057c-kube-api-access-b5r22\") pod \"rabbitmq-server-0\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " pod="openstack/rabbitmq-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.903596 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7fa97e79-a30c-4722-b02b-ec5494bd057c-config-data\") pod \"rabbitmq-server-0\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " pod="openstack/rabbitmq-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.903624 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7fa97e79-a30c-4722-b02b-ec5494bd057c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " pod="openstack/rabbitmq-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.903637 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7fa97e79-a30c-4722-b02b-ec5494bd057c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " pod="openstack/rabbitmq-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 
14:58:53.903669 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7fa97e79-a30c-4722-b02b-ec5494bd057c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " pod="openstack/rabbitmq-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.903713 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7fa97e79-a30c-4722-b02b-ec5494bd057c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " pod="openstack/rabbitmq-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.903760 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " pod="openstack/rabbitmq-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.903777 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7fa97e79-a30c-4722-b02b-ec5494bd057c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " pod="openstack/rabbitmq-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.905915 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7fa97e79-a30c-4722-b02b-ec5494bd057c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " pod="openstack/rabbitmq-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.911476 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/7fa97e79-a30c-4722-b02b-ec5494bd057c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " pod="openstack/rabbitmq-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.911703 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7fa97e79-a30c-4722-b02b-ec5494bd057c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " pod="openstack/rabbitmq-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.911711 4774 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.912050 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7fa97e79-a30c-4722-b02b-ec5494bd057c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " pod="openstack/rabbitmq-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.912411 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7fa97e79-a30c-4722-b02b-ec5494bd057c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " pod="openstack/rabbitmq-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.912528 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7fa97e79-a30c-4722-b02b-ec5494bd057c-config-data\") pod \"rabbitmq-server-0\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " 
pod="openstack/rabbitmq-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.913521 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7fa97e79-a30c-4722-b02b-ec5494bd057c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " pod="openstack/rabbitmq-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.914673 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7fa97e79-a30c-4722-b02b-ec5494bd057c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " pod="openstack/rabbitmq-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.918383 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7fa97e79-a30c-4722-b02b-ec5494bd057c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " pod="openstack/rabbitmq-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.930488 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5r22\" (UniqueName: \"kubernetes.io/projected/7fa97e79-a30c-4722-b02b-ec5494bd057c-kube-api-access-b5r22\") pod \"rabbitmq-server-0\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " pod="openstack/rabbitmq-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.943276 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " pod="openstack/rabbitmq-server-0" Oct 03 14:58:53 crc kubenswrapper[4774]: I1003 14:58:53.966480 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 14:58:54 crc kubenswrapper[4774]: I1003 14:58:54.203339 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 14:58:54 crc kubenswrapper[4774]: I1003 14:58:54.553792 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 14:58:54 crc kubenswrapper[4774]: I1003 14:58:54.680950 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0a0a516a-bd97-4484-802b-71eb14f3ca3f","Type":"ContainerStarted","Data":"0c2a85bc4b943600c668b1dac977ad3cbb5963f2c4256a283fbe1ed55e791e64"} Oct 03 14:58:54 crc kubenswrapper[4774]: I1003 14:58:54.685885 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7fa97e79-a30c-4722-b02b-ec5494bd057c","Type":"ContainerStarted","Data":"fb486f9b13d9d4abe260a357aa01f2ec006746e3d1b973422ea45d6803516e4a"} Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.126018 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.127163 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.132491 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-xnqzr" Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.132583 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.132730 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.132901 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.137187 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.141657 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.145508 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.236843 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jdwk\" (UniqueName: \"kubernetes.io/projected/27254696-8788-47fe-a9a4-208fd295e427-kube-api-access-5jdwk\") pod \"openstack-cell1-galera-0\" (UID: \"27254696-8788-47fe-a9a4-208fd295e427\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.236902 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/27254696-8788-47fe-a9a4-208fd295e427-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: 
\"27254696-8788-47fe-a9a4-208fd295e427\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.236925 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27254696-8788-47fe-a9a4-208fd295e427-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"27254696-8788-47fe-a9a4-208fd295e427\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.236954 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"27254696-8788-47fe-a9a4-208fd295e427\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.236987 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/27254696-8788-47fe-a9a4-208fd295e427-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"27254696-8788-47fe-a9a4-208fd295e427\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.237009 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/27254696-8788-47fe-a9a4-208fd295e427-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"27254696-8788-47fe-a9a4-208fd295e427\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.237047 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/27254696-8788-47fe-a9a4-208fd295e427-secrets\") pod \"openstack-cell1-galera-0\" (UID: 
\"27254696-8788-47fe-a9a4-208fd295e427\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.237063 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27254696-8788-47fe-a9a4-208fd295e427-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"27254696-8788-47fe-a9a4-208fd295e427\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.237084 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/27254696-8788-47fe-a9a4-208fd295e427-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"27254696-8788-47fe-a9a4-208fd295e427\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.338220 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jdwk\" (UniqueName: \"kubernetes.io/projected/27254696-8788-47fe-a9a4-208fd295e427-kube-api-access-5jdwk\") pod \"openstack-cell1-galera-0\" (UID: \"27254696-8788-47fe-a9a4-208fd295e427\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.338289 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/27254696-8788-47fe-a9a4-208fd295e427-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"27254696-8788-47fe-a9a4-208fd295e427\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.338319 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27254696-8788-47fe-a9a4-208fd295e427-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: 
\"27254696-8788-47fe-a9a4-208fd295e427\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.338360 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"27254696-8788-47fe-a9a4-208fd295e427\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.338415 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/27254696-8788-47fe-a9a4-208fd295e427-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"27254696-8788-47fe-a9a4-208fd295e427\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.338445 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/27254696-8788-47fe-a9a4-208fd295e427-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"27254696-8788-47fe-a9a4-208fd295e427\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.338469 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/27254696-8788-47fe-a9a4-208fd295e427-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"27254696-8788-47fe-a9a4-208fd295e427\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.338491 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27254696-8788-47fe-a9a4-208fd295e427-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"27254696-8788-47fe-a9a4-208fd295e427\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 
14:58:55.338536 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/27254696-8788-47fe-a9a4-208fd295e427-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"27254696-8788-47fe-a9a4-208fd295e427\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.339073 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/27254696-8788-47fe-a9a4-208fd295e427-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"27254696-8788-47fe-a9a4-208fd295e427\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.340012 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/27254696-8788-47fe-a9a4-208fd295e427-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"27254696-8788-47fe-a9a4-208fd295e427\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.341227 4774 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"27254696-8788-47fe-a9a4-208fd295e427\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-cell1-galera-0" Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.350824 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/27254696-8788-47fe-a9a4-208fd295e427-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"27254696-8788-47fe-a9a4-208fd295e427\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.355825 4774 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/27254696-8788-47fe-a9a4-208fd295e427-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"27254696-8788-47fe-a9a4-208fd295e427\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.355952 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27254696-8788-47fe-a9a4-208fd295e427-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"27254696-8788-47fe-a9a4-208fd295e427\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.356009 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/27254696-8788-47fe-a9a4-208fd295e427-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"27254696-8788-47fe-a9a4-208fd295e427\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.356723 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27254696-8788-47fe-a9a4-208fd295e427-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"27254696-8788-47fe-a9a4-208fd295e427\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.359820 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jdwk\" (UniqueName: \"kubernetes.io/projected/27254696-8788-47fe-a9a4-208fd295e427-kube-api-access-5jdwk\") pod \"openstack-cell1-galera-0\" (UID: \"27254696-8788-47fe-a9a4-208fd295e427\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.367703 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod 
\"openstack-cell1-galera-0\" (UID: \"27254696-8788-47fe-a9a4-208fd295e427\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:58:55 crc kubenswrapper[4774]: I1003 14:58:55.460517 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.166657 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 14:58:56 crc kubenswrapper[4774]: W1003 14:58:56.178496 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27254696_8788_47fe_a9a4_208fd295e427.slice/crio-d77d9d3ad94270456ab4192ca78b9fc6fd0850c1b1c01ef62731186012d3db02 WatchSource:0}: Error finding container d77d9d3ad94270456ab4192ca78b9fc6fd0850c1b1c01ef62731186012d3db02: Status 404 returned error can't find the container with id d77d9d3ad94270456ab4192ca78b9fc6fd0850c1b1c01ef62731186012d3db02 Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.364674 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.365821 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.371710 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.372069 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.372978 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-gwh6l" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.375149 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.376880 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.473665 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9rx9\" (UniqueName: \"kubernetes.io/projected/a5455d9b-4489-4041-b44d-990124dd84e4-kube-api-access-c9rx9\") pod \"openstack-galera-0\" (UID: \"a5455d9b-4489-4041-b44d-990124dd84e4\") " pod="openstack/openstack-galera-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.473731 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5455d9b-4489-4041-b44d-990124dd84e4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a5455d9b-4489-4041-b44d-990124dd84e4\") " pod="openstack/openstack-galera-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.473750 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5455d9b-4489-4041-b44d-990124dd84e4-operator-scripts\") pod \"openstack-galera-0\" 
(UID: \"a5455d9b-4489-4041-b44d-990124dd84e4\") " pod="openstack/openstack-galera-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.473775 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a5455d9b-4489-4041-b44d-990124dd84e4-kolla-config\") pod \"openstack-galera-0\" (UID: \"a5455d9b-4489-4041-b44d-990124dd84e4\") " pod="openstack/openstack-galera-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.473889 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a5455d9b-4489-4041-b44d-990124dd84e4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a5455d9b-4489-4041-b44d-990124dd84e4\") " pod="openstack/openstack-galera-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.473935 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"a5455d9b-4489-4041-b44d-990124dd84e4\") " pod="openstack/openstack-galera-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.474036 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a5455d9b-4489-4041-b44d-990124dd84e4-config-data-default\") pod \"openstack-galera-0\" (UID: \"a5455d9b-4489-4041-b44d-990124dd84e4\") " pod="openstack/openstack-galera-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.474163 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/a5455d9b-4489-4041-b44d-990124dd84e4-secrets\") pod \"openstack-galera-0\" (UID: \"a5455d9b-4489-4041-b44d-990124dd84e4\") " 
pod="openstack/openstack-galera-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.474186 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5455d9b-4489-4041-b44d-990124dd84e4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a5455d9b-4489-4041-b44d-990124dd84e4\") " pod="openstack/openstack-galera-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.575289 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/a5455d9b-4489-4041-b44d-990124dd84e4-secrets\") pod \"openstack-galera-0\" (UID: \"a5455d9b-4489-4041-b44d-990124dd84e4\") " pod="openstack/openstack-galera-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.575323 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5455d9b-4489-4041-b44d-990124dd84e4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a5455d9b-4489-4041-b44d-990124dd84e4\") " pod="openstack/openstack-galera-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.575360 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9rx9\" (UniqueName: \"kubernetes.io/projected/a5455d9b-4489-4041-b44d-990124dd84e4-kube-api-access-c9rx9\") pod \"openstack-galera-0\" (UID: \"a5455d9b-4489-4041-b44d-990124dd84e4\") " pod="openstack/openstack-galera-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.575398 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5455d9b-4489-4041-b44d-990124dd84e4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a5455d9b-4489-4041-b44d-990124dd84e4\") " pod="openstack/openstack-galera-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.575417 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5455d9b-4489-4041-b44d-990124dd84e4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a5455d9b-4489-4041-b44d-990124dd84e4\") " pod="openstack/openstack-galera-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.575438 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a5455d9b-4489-4041-b44d-990124dd84e4-kolla-config\") pod \"openstack-galera-0\" (UID: \"a5455d9b-4489-4041-b44d-990124dd84e4\") " pod="openstack/openstack-galera-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.575457 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a5455d9b-4489-4041-b44d-990124dd84e4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a5455d9b-4489-4041-b44d-990124dd84e4\") " pod="openstack/openstack-galera-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.575490 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"a5455d9b-4489-4041-b44d-990124dd84e4\") " pod="openstack/openstack-galera-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.575526 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a5455d9b-4489-4041-b44d-990124dd84e4-config-data-default\") pod \"openstack-galera-0\" (UID: \"a5455d9b-4489-4041-b44d-990124dd84e4\") " pod="openstack/openstack-galera-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.577577 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/a5455d9b-4489-4041-b44d-990124dd84e4-config-data-default\") pod \"openstack-galera-0\" (UID: \"a5455d9b-4489-4041-b44d-990124dd84e4\") " pod="openstack/openstack-galera-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.578527 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a5455d9b-4489-4041-b44d-990124dd84e4-kolla-config\") pod \"openstack-galera-0\" (UID: \"a5455d9b-4489-4041-b44d-990124dd84e4\") " pod="openstack/openstack-galera-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.578618 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5455d9b-4489-4041-b44d-990124dd84e4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a5455d9b-4489-4041-b44d-990124dd84e4\") " pod="openstack/openstack-galera-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.578766 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a5455d9b-4489-4041-b44d-990124dd84e4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a5455d9b-4489-4041-b44d-990124dd84e4\") " pod="openstack/openstack-galera-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.578940 4774 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"a5455d9b-4489-4041-b44d-990124dd84e4\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/openstack-galera-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.588965 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5455d9b-4489-4041-b44d-990124dd84e4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: 
\"a5455d9b-4489-4041-b44d-990124dd84e4\") " pod="openstack/openstack-galera-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.588973 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5455d9b-4489-4041-b44d-990124dd84e4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a5455d9b-4489-4041-b44d-990124dd84e4\") " pod="openstack/openstack-galera-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.589882 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/a5455d9b-4489-4041-b44d-990124dd84e4-secrets\") pod \"openstack-galera-0\" (UID: \"a5455d9b-4489-4041-b44d-990124dd84e4\") " pod="openstack/openstack-galera-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.609002 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9rx9\" (UniqueName: \"kubernetes.io/projected/a5455d9b-4489-4041-b44d-990124dd84e4-kube-api-access-c9rx9\") pod \"openstack-galera-0\" (UID: \"a5455d9b-4489-4041-b44d-990124dd84e4\") " pod="openstack/openstack-galera-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.633286 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"a5455d9b-4489-4041-b44d-990124dd84e4\") " pod="openstack/openstack-galera-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.698582 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.698895 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.699981 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.702601 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.707825 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.708054 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-zfntg" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.712088 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.762614 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"27254696-8788-47fe-a9a4-208fd295e427","Type":"ContainerStarted","Data":"d77d9d3ad94270456ab4192ca78b9fc6fd0850c1b1c01ef62731186012d3db02"} Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.780993 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rfhp\" (UniqueName: \"kubernetes.io/projected/09034f5f-3011-4604-8b05-f8a3fef6a74a-kube-api-access-8rfhp\") pod \"memcached-0\" (UID: \"09034f5f-3011-4604-8b05-f8a3fef6a74a\") " pod="openstack/memcached-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.781051 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09034f5f-3011-4604-8b05-f8a3fef6a74a-combined-ca-bundle\") pod \"memcached-0\" (UID: \"09034f5f-3011-4604-8b05-f8a3fef6a74a\") " pod="openstack/memcached-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.781096 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/09034f5f-3011-4604-8b05-f8a3fef6a74a-config-data\") pod \"memcached-0\" (UID: \"09034f5f-3011-4604-8b05-f8a3fef6a74a\") " pod="openstack/memcached-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.781111 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/09034f5f-3011-4604-8b05-f8a3fef6a74a-memcached-tls-certs\") pod \"memcached-0\" (UID: \"09034f5f-3011-4604-8b05-f8a3fef6a74a\") " pod="openstack/memcached-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.781155 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/09034f5f-3011-4604-8b05-f8a3fef6a74a-kolla-config\") pod \"memcached-0\" (UID: \"09034f5f-3011-4604-8b05-f8a3fef6a74a\") " pod="openstack/memcached-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.883779 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/09034f5f-3011-4604-8b05-f8a3fef6a74a-kolla-config\") pod \"memcached-0\" (UID: \"09034f5f-3011-4604-8b05-f8a3fef6a74a\") " pod="openstack/memcached-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.884429 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/09034f5f-3011-4604-8b05-f8a3fef6a74a-kolla-config\") pod \"memcached-0\" (UID: \"09034f5f-3011-4604-8b05-f8a3fef6a74a\") " pod="openstack/memcached-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.884579 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rfhp\" (UniqueName: \"kubernetes.io/projected/09034f5f-3011-4604-8b05-f8a3fef6a74a-kube-api-access-8rfhp\") pod \"memcached-0\" (UID: \"09034f5f-3011-4604-8b05-f8a3fef6a74a\") " pod="openstack/memcached-0" Oct 03 
14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.884713 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09034f5f-3011-4604-8b05-f8a3fef6a74a-combined-ca-bundle\") pod \"memcached-0\" (UID: \"09034f5f-3011-4604-8b05-f8a3fef6a74a\") " pod="openstack/memcached-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.885156 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/09034f5f-3011-4604-8b05-f8a3fef6a74a-config-data\") pod \"memcached-0\" (UID: \"09034f5f-3011-4604-8b05-f8a3fef6a74a\") " pod="openstack/memcached-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.885255 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/09034f5f-3011-4604-8b05-f8a3fef6a74a-memcached-tls-certs\") pod \"memcached-0\" (UID: \"09034f5f-3011-4604-8b05-f8a3fef6a74a\") " pod="openstack/memcached-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.887075 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/09034f5f-3011-4604-8b05-f8a3fef6a74a-config-data\") pod \"memcached-0\" (UID: \"09034f5f-3011-4604-8b05-f8a3fef6a74a\") " pod="openstack/memcached-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.890284 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/09034f5f-3011-4604-8b05-f8a3fef6a74a-memcached-tls-certs\") pod \"memcached-0\" (UID: \"09034f5f-3011-4604-8b05-f8a3fef6a74a\") " pod="openstack/memcached-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.890759 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/09034f5f-3011-4604-8b05-f8a3fef6a74a-combined-ca-bundle\") pod \"memcached-0\" (UID: \"09034f5f-3011-4604-8b05-f8a3fef6a74a\") " pod="openstack/memcached-0" Oct 03 14:58:56 crc kubenswrapper[4774]: I1003 14:58:56.909938 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rfhp\" (UniqueName: \"kubernetes.io/projected/09034f5f-3011-4604-8b05-f8a3fef6a74a-kube-api-access-8rfhp\") pod \"memcached-0\" (UID: \"09034f5f-3011-4604-8b05-f8a3fef6a74a\") " pod="openstack/memcached-0" Oct 03 14:58:57 crc kubenswrapper[4774]: I1003 14:58:57.028674 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 03 14:58:57 crc kubenswrapper[4774]: I1003 14:58:57.335414 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 03 14:58:57 crc kubenswrapper[4774]: I1003 14:58:57.579193 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 03 14:58:57 crc kubenswrapper[4774]: W1003 14:58:57.595519 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09034f5f_3011_4604_8b05_f8a3fef6a74a.slice/crio-904ba40cd3a6a622078bb5206d0e900132bae8bc677df9b6fb09d8a95fc1bc0e WatchSource:0}: Error finding container 904ba40cd3a6a622078bb5206d0e900132bae8bc677df9b6fb09d8a95fc1bc0e: Status 404 returned error can't find the container with id 904ba40cd3a6a622078bb5206d0e900132bae8bc677df9b6fb09d8a95fc1bc0e Oct 03 14:58:57 crc kubenswrapper[4774]: I1003 14:58:57.788292 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"09034f5f-3011-4604-8b05-f8a3fef6a74a","Type":"ContainerStarted","Data":"904ba40cd3a6a622078bb5206d0e900132bae8bc677df9b6fb09d8a95fc1bc0e"} Oct 03 14:58:57 crc kubenswrapper[4774]: I1003 14:58:57.798000 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-galera-0" event={"ID":"a5455d9b-4489-4041-b44d-990124dd84e4","Type":"ContainerStarted","Data":"02167502b1116f10bba4481a128028417e8f3b066c29ff9bef2b830137c8ed6a"} Oct 03 14:58:58 crc kubenswrapper[4774]: I1003 14:58:58.331519 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 14:58:58 crc kubenswrapper[4774]: I1003 14:58:58.341607 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 14:58:58 crc kubenswrapper[4774]: I1003 14:58:58.354547 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-j4j67" Oct 03 14:58:58 crc kubenswrapper[4774]: I1003 14:58:58.364822 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 14:58:58 crc kubenswrapper[4774]: I1003 14:58:58.415280 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjjfp\" (UniqueName: \"kubernetes.io/projected/dc5407ed-2e7c-4f5e-b24a-f040659e71f1-kube-api-access-sjjfp\") pod \"kube-state-metrics-0\" (UID: \"dc5407ed-2e7c-4f5e-b24a-f040659e71f1\") " pod="openstack/kube-state-metrics-0" Oct 03 14:58:58 crc kubenswrapper[4774]: I1003 14:58:58.523189 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjjfp\" (UniqueName: \"kubernetes.io/projected/dc5407ed-2e7c-4f5e-b24a-f040659e71f1-kube-api-access-sjjfp\") pod \"kube-state-metrics-0\" (UID: \"dc5407ed-2e7c-4f5e-b24a-f040659e71f1\") " pod="openstack/kube-state-metrics-0" Oct 03 14:58:58 crc kubenswrapper[4774]: I1003 14:58:58.571592 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjjfp\" (UniqueName: \"kubernetes.io/projected/dc5407ed-2e7c-4f5e-b24a-f040659e71f1-kube-api-access-sjjfp\") pod \"kube-state-metrics-0\" (UID: \"dc5407ed-2e7c-4f5e-b24a-f040659e71f1\") " 
pod="openstack/kube-state-metrics-0" Oct 03 14:58:58 crc kubenswrapper[4774]: I1003 14:58:58.674042 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 14:58:59 crc kubenswrapper[4774]: I1003 14:58:59.262092 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 14:58:59 crc kubenswrapper[4774]: I1003 14:58:59.852399 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"dc5407ed-2e7c-4f5e-b24a-f040659e71f1","Type":"ContainerStarted","Data":"c18661d3fffda6642bd4b7a66fd89d9daf7f14954054f1f1714bc4aafd7d7921"} Oct 03 14:59:02 crc kubenswrapper[4774]: I1003 14:59:02.928065 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-4wgb7"] Oct 03 14:59:02 crc kubenswrapper[4774]: I1003 14:59:02.929993 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-4wgb7" Oct 03 14:59:02 crc kubenswrapper[4774]: I1003 14:59:02.931511 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 03 14:59:02 crc kubenswrapper[4774]: I1003 14:59:02.932716 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-6rdtc" Oct 03 14:59:02 crc kubenswrapper[4774]: I1003 14:59:02.933005 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 03 14:59:02 crc kubenswrapper[4774]: I1003 14:59:02.938238 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-bhmcl"] Oct 03 14:59:02 crc kubenswrapper[4774]: I1003 14:59:02.940397 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-bhmcl" Oct 03 14:59:02 crc kubenswrapper[4774]: I1003 14:59:02.951014 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-4wgb7"] Oct 03 14:59:02 crc kubenswrapper[4774]: I1003 14:59:02.961790 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-bhmcl"] Oct 03 14:59:02 crc kubenswrapper[4774]: I1003 14:59:02.992527 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9111154-59d2-4b07-b8c3-db1870883cde-ovn-controller-tls-certs\") pod \"ovn-controller-4wgb7\" (UID: \"b9111154-59d2-4b07-b8c3-db1870883cde\") " pod="openstack/ovn-controller-4wgb7" Oct 03 14:59:02 crc kubenswrapper[4774]: I1003 14:59:02.992626 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp5ws\" (UniqueName: \"kubernetes.io/projected/b9111154-59d2-4b07-b8c3-db1870883cde-kube-api-access-tp5ws\") pod \"ovn-controller-4wgb7\" (UID: \"b9111154-59d2-4b07-b8c3-db1870883cde\") " pod="openstack/ovn-controller-4wgb7" Oct 03 14:59:02 crc kubenswrapper[4774]: I1003 14:59:02.992657 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b9111154-59d2-4b07-b8c3-db1870883cde-var-run-ovn\") pod \"ovn-controller-4wgb7\" (UID: \"b9111154-59d2-4b07-b8c3-db1870883cde\") " pod="openstack/ovn-controller-4wgb7" Oct 03 14:59:02 crc kubenswrapper[4774]: I1003 14:59:02.992696 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b9111154-59d2-4b07-b8c3-db1870883cde-var-log-ovn\") pod \"ovn-controller-4wgb7\" (UID: \"b9111154-59d2-4b07-b8c3-db1870883cde\") " pod="openstack/ovn-controller-4wgb7" Oct 03 14:59:02 crc 
kubenswrapper[4774]: I1003 14:59:02.992743 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9111154-59d2-4b07-b8c3-db1870883cde-combined-ca-bundle\") pod \"ovn-controller-4wgb7\" (UID: \"b9111154-59d2-4b07-b8c3-db1870883cde\") " pod="openstack/ovn-controller-4wgb7" Oct 03 14:59:02 crc kubenswrapper[4774]: I1003 14:59:02.992777 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9111154-59d2-4b07-b8c3-db1870883cde-scripts\") pod \"ovn-controller-4wgb7\" (UID: \"b9111154-59d2-4b07-b8c3-db1870883cde\") " pod="openstack/ovn-controller-4wgb7" Oct 03 14:59:02 crc kubenswrapper[4774]: I1003 14:59:02.992822 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b9111154-59d2-4b07-b8c3-db1870883cde-var-run\") pod \"ovn-controller-4wgb7\" (UID: \"b9111154-59d2-4b07-b8c3-db1870883cde\") " pod="openstack/ovn-controller-4wgb7" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.094100 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/03ee58bb-cd78-4fdb-986f-a9b60f9998e8-etc-ovs\") pod \"ovn-controller-ovs-bhmcl\" (UID: \"03ee58bb-cd78-4fdb-986f-a9b60f9998e8\") " pod="openstack/ovn-controller-ovs-bhmcl" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.094151 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp5ws\" (UniqueName: \"kubernetes.io/projected/b9111154-59d2-4b07-b8c3-db1870883cde-kube-api-access-tp5ws\") pod \"ovn-controller-4wgb7\" (UID: \"b9111154-59d2-4b07-b8c3-db1870883cde\") " pod="openstack/ovn-controller-4wgb7" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.094172 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b9111154-59d2-4b07-b8c3-db1870883cde-var-run-ovn\") pod \"ovn-controller-4wgb7\" (UID: \"b9111154-59d2-4b07-b8c3-db1870883cde\") " pod="openstack/ovn-controller-4wgb7" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.094195 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/03ee58bb-cd78-4fdb-986f-a9b60f9998e8-var-log\") pod \"ovn-controller-ovs-bhmcl\" (UID: \"03ee58bb-cd78-4fdb-986f-a9b60f9998e8\") " pod="openstack/ovn-controller-ovs-bhmcl" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.094214 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b9111154-59d2-4b07-b8c3-db1870883cde-var-log-ovn\") pod \"ovn-controller-4wgb7\" (UID: \"b9111154-59d2-4b07-b8c3-db1870883cde\") " pod="openstack/ovn-controller-4wgb7" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.094231 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/03ee58bb-cd78-4fdb-986f-a9b60f9998e8-var-lib\") pod \"ovn-controller-ovs-bhmcl\" (UID: \"03ee58bb-cd78-4fdb-986f-a9b60f9998e8\") " pod="openstack/ovn-controller-ovs-bhmcl" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.094270 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9111154-59d2-4b07-b8c3-db1870883cde-combined-ca-bundle\") pod \"ovn-controller-4wgb7\" (UID: \"b9111154-59d2-4b07-b8c3-db1870883cde\") " pod="openstack/ovn-controller-4wgb7" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.094295 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2qgg\" (UniqueName: 
\"kubernetes.io/projected/03ee58bb-cd78-4fdb-986f-a9b60f9998e8-kube-api-access-z2qgg\") pod \"ovn-controller-ovs-bhmcl\" (UID: \"03ee58bb-cd78-4fdb-986f-a9b60f9998e8\") " pod="openstack/ovn-controller-ovs-bhmcl" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.094313 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9111154-59d2-4b07-b8c3-db1870883cde-scripts\") pod \"ovn-controller-4wgb7\" (UID: \"b9111154-59d2-4b07-b8c3-db1870883cde\") " pod="openstack/ovn-controller-4wgb7" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.094363 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/03ee58bb-cd78-4fdb-986f-a9b60f9998e8-var-run\") pod \"ovn-controller-ovs-bhmcl\" (UID: \"03ee58bb-cd78-4fdb-986f-a9b60f9998e8\") " pod="openstack/ovn-controller-ovs-bhmcl" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.094403 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b9111154-59d2-4b07-b8c3-db1870883cde-var-run\") pod \"ovn-controller-4wgb7\" (UID: \"b9111154-59d2-4b07-b8c3-db1870883cde\") " pod="openstack/ovn-controller-4wgb7" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.094425 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9111154-59d2-4b07-b8c3-db1870883cde-ovn-controller-tls-certs\") pod \"ovn-controller-4wgb7\" (UID: \"b9111154-59d2-4b07-b8c3-db1870883cde\") " pod="openstack/ovn-controller-4wgb7" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.094444 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/03ee58bb-cd78-4fdb-986f-a9b60f9998e8-scripts\") pod 
\"ovn-controller-ovs-bhmcl\" (UID: \"03ee58bb-cd78-4fdb-986f-a9b60f9998e8\") " pod="openstack/ovn-controller-ovs-bhmcl" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.096491 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b9111154-59d2-4b07-b8c3-db1870883cde-var-run-ovn\") pod \"ovn-controller-4wgb7\" (UID: \"b9111154-59d2-4b07-b8c3-db1870883cde\") " pod="openstack/ovn-controller-4wgb7" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.096633 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b9111154-59d2-4b07-b8c3-db1870883cde-var-run\") pod \"ovn-controller-4wgb7\" (UID: \"b9111154-59d2-4b07-b8c3-db1870883cde\") " pod="openstack/ovn-controller-4wgb7" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.096839 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b9111154-59d2-4b07-b8c3-db1870883cde-var-log-ovn\") pod \"ovn-controller-4wgb7\" (UID: \"b9111154-59d2-4b07-b8c3-db1870883cde\") " pod="openstack/ovn-controller-4wgb7" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.096994 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9111154-59d2-4b07-b8c3-db1870883cde-scripts\") pod \"ovn-controller-4wgb7\" (UID: \"b9111154-59d2-4b07-b8c3-db1870883cde\") " pod="openstack/ovn-controller-4wgb7" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.100127 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9111154-59d2-4b07-b8c3-db1870883cde-ovn-controller-tls-certs\") pod \"ovn-controller-4wgb7\" (UID: \"b9111154-59d2-4b07-b8c3-db1870883cde\") " pod="openstack/ovn-controller-4wgb7" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.104914 4774 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9111154-59d2-4b07-b8c3-db1870883cde-combined-ca-bundle\") pod \"ovn-controller-4wgb7\" (UID: \"b9111154-59d2-4b07-b8c3-db1870883cde\") " pod="openstack/ovn-controller-4wgb7" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.117151 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp5ws\" (UniqueName: \"kubernetes.io/projected/b9111154-59d2-4b07-b8c3-db1870883cde-kube-api-access-tp5ws\") pod \"ovn-controller-4wgb7\" (UID: \"b9111154-59d2-4b07-b8c3-db1870883cde\") " pod="openstack/ovn-controller-4wgb7" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.195895 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/03ee58bb-cd78-4fdb-986f-a9b60f9998e8-var-log\") pod \"ovn-controller-ovs-bhmcl\" (UID: \"03ee58bb-cd78-4fdb-986f-a9b60f9998e8\") " pod="openstack/ovn-controller-ovs-bhmcl" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.195973 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/03ee58bb-cd78-4fdb-986f-a9b60f9998e8-var-lib\") pod \"ovn-controller-ovs-bhmcl\" (UID: \"03ee58bb-cd78-4fdb-986f-a9b60f9998e8\") " pod="openstack/ovn-controller-ovs-bhmcl" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.196064 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2qgg\" (UniqueName: \"kubernetes.io/projected/03ee58bb-cd78-4fdb-986f-a9b60f9998e8-kube-api-access-z2qgg\") pod \"ovn-controller-ovs-bhmcl\" (UID: \"03ee58bb-cd78-4fdb-986f-a9b60f9998e8\") " pod="openstack/ovn-controller-ovs-bhmcl" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.196125 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/03ee58bb-cd78-4fdb-986f-a9b60f9998e8-var-run\") pod \"ovn-controller-ovs-bhmcl\" (UID: \"03ee58bb-cd78-4fdb-986f-a9b60f9998e8\") " pod="openstack/ovn-controller-ovs-bhmcl" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.196150 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/03ee58bb-cd78-4fdb-986f-a9b60f9998e8-scripts\") pod \"ovn-controller-ovs-bhmcl\" (UID: \"03ee58bb-cd78-4fdb-986f-a9b60f9998e8\") " pod="openstack/ovn-controller-ovs-bhmcl" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.196202 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/03ee58bb-cd78-4fdb-986f-a9b60f9998e8-etc-ovs\") pod \"ovn-controller-ovs-bhmcl\" (UID: \"03ee58bb-cd78-4fdb-986f-a9b60f9998e8\") " pod="openstack/ovn-controller-ovs-bhmcl" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.196212 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/03ee58bb-cd78-4fdb-986f-a9b60f9998e8-var-log\") pod \"ovn-controller-ovs-bhmcl\" (UID: \"03ee58bb-cd78-4fdb-986f-a9b60f9998e8\") " pod="openstack/ovn-controller-ovs-bhmcl" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.196289 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/03ee58bb-cd78-4fdb-986f-a9b60f9998e8-var-lib\") pod \"ovn-controller-ovs-bhmcl\" (UID: \"03ee58bb-cd78-4fdb-986f-a9b60f9998e8\") " pod="openstack/ovn-controller-ovs-bhmcl" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.198626 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/03ee58bb-cd78-4fdb-986f-a9b60f9998e8-etc-ovs\") pod \"ovn-controller-ovs-bhmcl\" (UID: \"03ee58bb-cd78-4fdb-986f-a9b60f9998e8\") " 
pod="openstack/ovn-controller-ovs-bhmcl" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.201503 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/03ee58bb-cd78-4fdb-986f-a9b60f9998e8-scripts\") pod \"ovn-controller-ovs-bhmcl\" (UID: \"03ee58bb-cd78-4fdb-986f-a9b60f9998e8\") " pod="openstack/ovn-controller-ovs-bhmcl" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.202497 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/03ee58bb-cd78-4fdb-986f-a9b60f9998e8-var-run\") pod \"ovn-controller-ovs-bhmcl\" (UID: \"03ee58bb-cd78-4fdb-986f-a9b60f9998e8\") " pod="openstack/ovn-controller-ovs-bhmcl" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.213155 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2qgg\" (UniqueName: \"kubernetes.io/projected/03ee58bb-cd78-4fdb-986f-a9b60f9998e8-kube-api-access-z2qgg\") pod \"ovn-controller-ovs-bhmcl\" (UID: \"03ee58bb-cd78-4fdb-986f-a9b60f9998e8\") " pod="openstack/ovn-controller-ovs-bhmcl" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.253956 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-4wgb7" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.262404 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-bhmcl" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.849608 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.851840 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.859461 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.859784 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.861527 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.861706 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-wtqxp" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.861861 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 03 14:59:03 crc kubenswrapper[4774]: I1003 14:59:03.874763 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 14:59:04 crc kubenswrapper[4774]: I1003 14:59:04.010910 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:59:04 crc kubenswrapper[4774]: I1003 14:59:04.010961 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66-config\") pod \"ovsdbserver-nb-0\" (UID: \"fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:59:04 crc kubenswrapper[4774]: I1003 14:59:04.010994 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:59:04 crc kubenswrapper[4774]: I1003 14:59:04.011013 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:59:04 crc kubenswrapper[4774]: I1003 14:59:04.011035 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:59:04 crc kubenswrapper[4774]: I1003 14:59:04.011087 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:59:04 crc kubenswrapper[4774]: I1003 14:59:04.011107 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:59:04 crc kubenswrapper[4774]: I1003 14:59:04.011126 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt4x7\" (UniqueName: 
\"kubernetes.io/projected/fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66-kube-api-access-mt4x7\") pod \"ovsdbserver-nb-0\" (UID: \"fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:59:04 crc kubenswrapper[4774]: I1003 14:59:04.112588 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:59:04 crc kubenswrapper[4774]: I1003 14:59:04.112639 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66-config\") pod \"ovsdbserver-nb-0\" (UID: \"fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:59:04 crc kubenswrapper[4774]: I1003 14:59:04.112672 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:59:04 crc kubenswrapper[4774]: I1003 14:59:04.112690 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:59:04 crc kubenswrapper[4774]: I1003 14:59:04.112715 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66\") " pod="openstack/ovsdbserver-nb-0" 
Oct 03 14:59:04 crc kubenswrapper[4774]: I1003 14:59:04.112777 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:59:04 crc kubenswrapper[4774]: I1003 14:59:04.112803 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:59:04 crc kubenswrapper[4774]: I1003 14:59:04.112830 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt4x7\" (UniqueName: \"kubernetes.io/projected/fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66-kube-api-access-mt4x7\") pod \"ovsdbserver-nb-0\" (UID: \"fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:59:04 crc kubenswrapper[4774]: I1003 14:59:04.113486 4774 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0" Oct 03 14:59:04 crc kubenswrapper[4774]: I1003 14:59:04.114696 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66-config\") pod \"ovsdbserver-nb-0\" (UID: \"fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:59:04 crc kubenswrapper[4774]: I1003 14:59:04.118421 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:59:04 crc kubenswrapper[4774]: I1003 14:59:04.118678 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:59:04 crc kubenswrapper[4774]: I1003 14:59:04.118766 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:59:04 crc kubenswrapper[4774]: I1003 14:59:04.120340 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:59:04 crc kubenswrapper[4774]: I1003 14:59:04.123929 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:59:04 crc kubenswrapper[4774]: I1003 14:59:04.132698 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt4x7\" (UniqueName: \"kubernetes.io/projected/fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66-kube-api-access-mt4x7\") pod \"ovsdbserver-nb-0\" (UID: \"fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66\") " 
pod="openstack/ovsdbserver-nb-0" Oct 03 14:59:04 crc kubenswrapper[4774]: I1003 14:59:04.137300 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:59:04 crc kubenswrapper[4774]: I1003 14:59:04.172507 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 03 14:59:05 crc kubenswrapper[4774]: I1003 14:59:05.581000 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 14:59:05 crc kubenswrapper[4774]: I1003 14:59:05.582882 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 03 14:59:05 crc kubenswrapper[4774]: I1003 14:59:05.585841 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-9vk6p" Oct 03 14:59:05 crc kubenswrapper[4774]: I1003 14:59:05.585941 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 03 14:59:05 crc kubenswrapper[4774]: I1003 14:59:05.586124 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 03 14:59:05 crc kubenswrapper[4774]: I1003 14:59:05.587462 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 03 14:59:05 crc kubenswrapper[4774]: I1003 14:59:05.605754 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 14:59:05 crc kubenswrapper[4774]: I1003 14:59:05.638192 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq2jk\" (UniqueName: 
\"kubernetes.io/projected/c880318a-5ff5-46f8-aca9-134c52ed3ad1-kube-api-access-zq2jk\") pod \"ovsdbserver-sb-0\" (UID: \"c880318a-5ff5-46f8-aca9-134c52ed3ad1\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:59:05 crc kubenswrapper[4774]: I1003 14:59:05.638269 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c880318a-5ff5-46f8-aca9-134c52ed3ad1-config\") pod \"ovsdbserver-sb-0\" (UID: \"c880318a-5ff5-46f8-aca9-134c52ed3ad1\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:59:05 crc kubenswrapper[4774]: I1003 14:59:05.638299 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c880318a-5ff5-46f8-aca9-134c52ed3ad1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c880318a-5ff5-46f8-aca9-134c52ed3ad1\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:59:05 crc kubenswrapper[4774]: I1003 14:59:05.638340 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c880318a-5ff5-46f8-aca9-134c52ed3ad1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c880318a-5ff5-46f8-aca9-134c52ed3ad1\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:59:05 crc kubenswrapper[4774]: I1003 14:59:05.638551 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c880318a-5ff5-46f8-aca9-134c52ed3ad1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c880318a-5ff5-46f8-aca9-134c52ed3ad1\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:59:05 crc kubenswrapper[4774]: I1003 14:59:05.638729 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: 
\"c880318a-5ff5-46f8-aca9-134c52ed3ad1\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:59:05 crc kubenswrapper[4774]: I1003 14:59:05.638841 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c880318a-5ff5-46f8-aca9-134c52ed3ad1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c880318a-5ff5-46f8-aca9-134c52ed3ad1\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:59:05 crc kubenswrapper[4774]: I1003 14:59:05.638867 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c880318a-5ff5-46f8-aca9-134c52ed3ad1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c880318a-5ff5-46f8-aca9-134c52ed3ad1\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:59:05 crc kubenswrapper[4774]: I1003 14:59:05.740199 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq2jk\" (UniqueName: \"kubernetes.io/projected/c880318a-5ff5-46f8-aca9-134c52ed3ad1-kube-api-access-zq2jk\") pod \"ovsdbserver-sb-0\" (UID: \"c880318a-5ff5-46f8-aca9-134c52ed3ad1\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:59:05 crc kubenswrapper[4774]: I1003 14:59:05.740274 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c880318a-5ff5-46f8-aca9-134c52ed3ad1-config\") pod \"ovsdbserver-sb-0\" (UID: \"c880318a-5ff5-46f8-aca9-134c52ed3ad1\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:59:05 crc kubenswrapper[4774]: I1003 14:59:05.740302 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c880318a-5ff5-46f8-aca9-134c52ed3ad1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c880318a-5ff5-46f8-aca9-134c52ed3ad1\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:59:05 crc 
kubenswrapper[4774]: I1003 14:59:05.740343 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c880318a-5ff5-46f8-aca9-134c52ed3ad1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c880318a-5ff5-46f8-aca9-134c52ed3ad1\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:59:05 crc kubenswrapper[4774]: I1003 14:59:05.740445 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c880318a-5ff5-46f8-aca9-134c52ed3ad1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c880318a-5ff5-46f8-aca9-134c52ed3ad1\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:59:05 crc kubenswrapper[4774]: I1003 14:59:05.740494 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c880318a-5ff5-46f8-aca9-134c52ed3ad1\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:59:05 crc kubenswrapper[4774]: I1003 14:59:05.740545 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c880318a-5ff5-46f8-aca9-134c52ed3ad1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c880318a-5ff5-46f8-aca9-134c52ed3ad1\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:59:05 crc kubenswrapper[4774]: I1003 14:59:05.740577 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c880318a-5ff5-46f8-aca9-134c52ed3ad1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c880318a-5ff5-46f8-aca9-134c52ed3ad1\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:59:05 crc kubenswrapper[4774]: I1003 14:59:05.741238 4774 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c880318a-5ff5-46f8-aca9-134c52ed3ad1\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-sb-0" Oct 03 14:59:05 crc kubenswrapper[4774]: I1003 14:59:05.742295 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c880318a-5ff5-46f8-aca9-134c52ed3ad1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c880318a-5ff5-46f8-aca9-134c52ed3ad1\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:59:05 crc kubenswrapper[4774]: I1003 14:59:05.742523 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c880318a-5ff5-46f8-aca9-134c52ed3ad1-config\") pod \"ovsdbserver-sb-0\" (UID: \"c880318a-5ff5-46f8-aca9-134c52ed3ad1\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:59:05 crc kubenswrapper[4774]: I1003 14:59:05.742567 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c880318a-5ff5-46f8-aca9-134c52ed3ad1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c880318a-5ff5-46f8-aca9-134c52ed3ad1\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:59:05 crc kubenswrapper[4774]: I1003 14:59:05.747077 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c880318a-5ff5-46f8-aca9-134c52ed3ad1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c880318a-5ff5-46f8-aca9-134c52ed3ad1\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:59:05 crc kubenswrapper[4774]: I1003 14:59:05.747119 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c880318a-5ff5-46f8-aca9-134c52ed3ad1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c880318a-5ff5-46f8-aca9-134c52ed3ad1\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:59:05 crc 
kubenswrapper[4774]: I1003 14:59:05.747474 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c880318a-5ff5-46f8-aca9-134c52ed3ad1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c880318a-5ff5-46f8-aca9-134c52ed3ad1\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:59:05 crc kubenswrapper[4774]: I1003 14:59:05.763622 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq2jk\" (UniqueName: \"kubernetes.io/projected/c880318a-5ff5-46f8-aca9-134c52ed3ad1-kube-api-access-zq2jk\") pod \"ovsdbserver-sb-0\" (UID: \"c880318a-5ff5-46f8-aca9-134c52ed3ad1\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:59:05 crc kubenswrapper[4774]: I1003 14:59:05.770023 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c880318a-5ff5-46f8-aca9-134c52ed3ad1\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:59:05 crc kubenswrapper[4774]: I1003 14:59:05.902310 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 03 14:59:09 crc kubenswrapper[4774]: E1003 14:59:09.810387 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Oct 03 14:59:09 crc kubenswrapper[4774]: E1003 14:59:09.811150 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b5r22,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(7fa97e79-a30c-4722-b02b-ec5494bd057c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 14:59:09 crc 
kubenswrapper[4774]: E1003 14:59:09.812350 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="7fa97e79-a30c-4722-b02b-ec5494bd057c" Oct 03 14:59:09 crc kubenswrapper[4774]: E1003 14:59:09.962404 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="7fa97e79-a30c-4722-b02b-ec5494bd057c" Oct 03 14:59:10 crc kubenswrapper[4774]: E1003 14:59:10.511925 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Oct 03 14:59:10 crc kubenswrapper[4774]: E1003 14:59:10.512122 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vw276,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(0a0a516a-bd97-4484-802b-71eb14f3ca3f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 14:59:10 crc 
kubenswrapper[4774]: E1003 14:59:10.513275 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="0a0a516a-bd97-4484-802b-71eb14f3ca3f" Oct 03 14:59:10 crc kubenswrapper[4774]: E1003 14:59:10.973621 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="0a0a516a-bd97-4484-802b-71eb14f3ca3f" Oct 03 14:59:19 crc kubenswrapper[4774]: E1003 14:59:19.745574 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 03 14:59:19 crc kubenswrapper[4774]: E1003 14:59:19.746005 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xgd8k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-h8wlr_openstack(ea92e131-6368-496f-b7ef-3affd118bb5a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 14:59:19 crc kubenswrapper[4774]: E1003 14:59:19.748236 4774 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 03 14:59:19 crc kubenswrapper[4774]: E1003 14:59:19.748436 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-59r49,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFile
system:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-xfhvv_openstack(fd4951b7-0996-49e5-9e11-b096eea903b5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 14:59:19 crc kubenswrapper[4774]: E1003 14:59:19.748558 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-h8wlr" podUID="ea92e131-6368-496f-b7ef-3affd118bb5a" Oct 03 14:59:19 crc kubenswrapper[4774]: E1003 14:59:19.750616 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-xfhvv" podUID="fd4951b7-0996-49e5-9e11-b096eea903b5" Oct 03 14:59:19 crc kubenswrapper[4774]: E1003 14:59:19.821051 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 03 14:59:19 crc kubenswrapper[4774]: E1003 14:59:19.821188 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 
--log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r5446,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-cmtxq_openstack(b91a9c71-b83c-4e14-957f-820cba700771): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 14:59:19 crc 
kubenswrapper[4774]: E1003 14:59:19.822386 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-cmtxq" podUID="b91a9c71-b83c-4e14-957f-820cba700771" Oct 03 14:59:19 crc kubenswrapper[4774]: E1003 14:59:19.956850 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 03 14:59:19 crc kubenswrapper[4774]: E1003 14:59:19.957394 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q5sqf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-tx545_openstack(f3d2fe24-39c4-492e-a220-d46d06b1b587): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 14:59:19 crc kubenswrapper[4774]: E1003 14:59:19.958449 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-tx545" podUID="f3d2fe24-39c4-492e-a220-d46d06b1b587" Oct 03 14:59:20 crc kubenswrapper[4774]: E1003 14:59:20.037816 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-h8wlr" podUID="ea92e131-6368-496f-b7ef-3affd118bb5a" Oct 03 14:59:20 crc kubenswrapper[4774]: E1003 14:59:20.037873 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-cmtxq" podUID="b91a9c71-b83c-4e14-957f-820cba700771" Oct 03 14:59:20 crc kubenswrapper[4774]: I1003 14:59:20.215077 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-4wgb7"] Oct 03 14:59:20 crc kubenswrapper[4774]: I1003 14:59:20.358705 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 14:59:20 crc kubenswrapper[4774]: I1003 14:59:20.445577 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 14:59:20 crc kubenswrapper[4774]: W1003 14:59:20.581684 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc880318a_5ff5_46f8_aca9_134c52ed3ad1.slice/crio-cc2867f2e2c4aa0941d1739a82d9bec7321c98b21120a5e7bee95cdbefc49cc6 WatchSource:0}: Error finding container cc2867f2e2c4aa0941d1739a82d9bec7321c98b21120a5e7bee95cdbefc49cc6: Status 404 returned error can't find the container with id cc2867f2e2c4aa0941d1739a82d9bec7321c98b21120a5e7bee95cdbefc49cc6 Oct 03 14:59:20 crc kubenswrapper[4774]: I1003 14:59:20.589527 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-tx545" Oct 03 14:59:20 crc kubenswrapper[4774]: I1003 14:59:20.599143 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-xfhvv" Oct 03 14:59:20 crc kubenswrapper[4774]: I1003 14:59:20.653862 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:59:20 crc kubenswrapper[4774]: I1003 14:59:20.653919 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:59:20 crc kubenswrapper[4774]: I1003 14:59:20.699140 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59r49\" (UniqueName: \"kubernetes.io/projected/fd4951b7-0996-49e5-9e11-b096eea903b5-kube-api-access-59r49\") pod \"fd4951b7-0996-49e5-9e11-b096eea903b5\" (UID: \"fd4951b7-0996-49e5-9e11-b096eea903b5\") " Oct 03 14:59:20 crc kubenswrapper[4774]: I1003 14:59:20.699209 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd4951b7-0996-49e5-9e11-b096eea903b5-dns-svc\") pod \"fd4951b7-0996-49e5-9e11-b096eea903b5\" (UID: \"fd4951b7-0996-49e5-9e11-b096eea903b5\") " Oct 03 14:59:20 crc kubenswrapper[4774]: I1003 14:59:20.699314 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3d2fe24-39c4-492e-a220-d46d06b1b587-config\") pod 
\"f3d2fe24-39c4-492e-a220-d46d06b1b587\" (UID: \"f3d2fe24-39c4-492e-a220-d46d06b1b587\") " Oct 03 14:59:20 crc kubenswrapper[4774]: I1003 14:59:20.699349 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5sqf\" (UniqueName: \"kubernetes.io/projected/f3d2fe24-39c4-492e-a220-d46d06b1b587-kube-api-access-q5sqf\") pod \"f3d2fe24-39c4-492e-a220-d46d06b1b587\" (UID: \"f3d2fe24-39c4-492e-a220-d46d06b1b587\") " Oct 03 14:59:20 crc kubenswrapper[4774]: I1003 14:59:20.699416 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd4951b7-0996-49e5-9e11-b096eea903b5-config\") pod \"fd4951b7-0996-49e5-9e11-b096eea903b5\" (UID: \"fd4951b7-0996-49e5-9e11-b096eea903b5\") " Oct 03 14:59:20 crc kubenswrapper[4774]: I1003 14:59:20.699691 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd4951b7-0996-49e5-9e11-b096eea903b5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fd4951b7-0996-49e5-9e11-b096eea903b5" (UID: "fd4951b7-0996-49e5-9e11-b096eea903b5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:59:20 crc kubenswrapper[4774]: I1003 14:59:20.699762 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3d2fe24-39c4-492e-a220-d46d06b1b587-config" (OuterVolumeSpecName: "config") pod "f3d2fe24-39c4-492e-a220-d46d06b1b587" (UID: "f3d2fe24-39c4-492e-a220-d46d06b1b587"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:59:20 crc kubenswrapper[4774]: I1003 14:59:20.699993 4774 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd4951b7-0996-49e5-9e11-b096eea903b5-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 14:59:20 crc kubenswrapper[4774]: I1003 14:59:20.700016 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3d2fe24-39c4-492e-a220-d46d06b1b587-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:59:20 crc kubenswrapper[4774]: I1003 14:59:20.701205 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd4951b7-0996-49e5-9e11-b096eea903b5-config" (OuterVolumeSpecName: "config") pod "fd4951b7-0996-49e5-9e11-b096eea903b5" (UID: "fd4951b7-0996-49e5-9e11-b096eea903b5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:59:20 crc kubenswrapper[4774]: I1003 14:59:20.704653 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd4951b7-0996-49e5-9e11-b096eea903b5-kube-api-access-59r49" (OuterVolumeSpecName: "kube-api-access-59r49") pod "fd4951b7-0996-49e5-9e11-b096eea903b5" (UID: "fd4951b7-0996-49e5-9e11-b096eea903b5"). InnerVolumeSpecName "kube-api-access-59r49". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:59:20 crc kubenswrapper[4774]: I1003 14:59:20.706428 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3d2fe24-39c4-492e-a220-d46d06b1b587-kube-api-access-q5sqf" (OuterVolumeSpecName: "kube-api-access-q5sqf") pod "f3d2fe24-39c4-492e-a220-d46d06b1b587" (UID: "f3d2fe24-39c4-492e-a220-d46d06b1b587"). InnerVolumeSpecName "kube-api-access-q5sqf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:59:20 crc kubenswrapper[4774]: I1003 14:59:20.801170 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5sqf\" (UniqueName: \"kubernetes.io/projected/f3d2fe24-39c4-492e-a220-d46d06b1b587-kube-api-access-q5sqf\") on node \"crc\" DevicePath \"\"" Oct 03 14:59:20 crc kubenswrapper[4774]: I1003 14:59:20.801540 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd4951b7-0996-49e5-9e11-b096eea903b5-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:59:20 crc kubenswrapper[4774]: I1003 14:59:20.801555 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59r49\" (UniqueName: \"kubernetes.io/projected/fd4951b7-0996-49e5-9e11-b096eea903b5-kube-api-access-59r49\") on node \"crc\" DevicePath \"\"" Oct 03 14:59:21 crc kubenswrapper[4774]: I1003 14:59:21.041657 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66","Type":"ContainerStarted","Data":"3a32190c0e7ef07f6e099d80aa511f7cf8033e958aec5a91a470f31348339d56"} Oct 03 14:59:21 crc kubenswrapper[4774]: I1003 14:59:21.043574 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a5455d9b-4489-4041-b44d-990124dd84e4","Type":"ContainerStarted","Data":"47df8cec5731e464746469c0881c5c7cc79c0e72e43bcbce84b93172eeea4dbf"} Oct 03 14:59:21 crc kubenswrapper[4774]: I1003 14:59:21.046169 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c880318a-5ff5-46f8-aca9-134c52ed3ad1","Type":"ContainerStarted","Data":"cc2867f2e2c4aa0941d1739a82d9bec7321c98b21120a5e7bee95cdbefc49cc6"} Oct 03 14:59:21 crc kubenswrapper[4774]: I1003 14:59:21.047586 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"27254696-8788-47fe-a9a4-208fd295e427","Type":"ContainerStarted","Data":"924121d1ba26c7434dd8cdae19f636eb00c3035d5cb792352eb70234f56fc364"} Oct 03 14:59:21 crc kubenswrapper[4774]: I1003 14:59:21.048582 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-xfhvv" event={"ID":"fd4951b7-0996-49e5-9e11-b096eea903b5","Type":"ContainerDied","Data":"5fe04dce08204570773d31601b1ed982973e155bb34b08f630d8c8befb5d55fc"} Oct 03 14:59:21 crc kubenswrapper[4774]: I1003 14:59:21.048644 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-xfhvv" Oct 03 14:59:21 crc kubenswrapper[4774]: I1003 14:59:21.050486 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4wgb7" event={"ID":"b9111154-59d2-4b07-b8c3-db1870883cde","Type":"ContainerStarted","Data":"b9a49243e77a1bbabaeaf55cd4ea19760683ce9b9e07b17d386b50ac83bee1bc"} Oct 03 14:59:21 crc kubenswrapper[4774]: I1003 14:59:21.051724 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-tx545" Oct 03 14:59:21 crc kubenswrapper[4774]: I1003 14:59:21.051737 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-tx545" event={"ID":"f3d2fe24-39c4-492e-a220-d46d06b1b587","Type":"ContainerDied","Data":"b80de6ed46de55a7e237a3e75653f3e4b8a6910f5186b4209084e8a32c99cc13"} Oct 03 14:59:21 crc kubenswrapper[4774]: I1003 14:59:21.052869 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"09034f5f-3011-4604-8b05-f8a3fef6a74a","Type":"ContainerStarted","Data":"8e9e3b197bb4ce4fd00c23c477e4a1569dec7d044c4bd958222c9e47b9107aa3"} Oct 03 14:59:21 crc kubenswrapper[4774]: I1003 14:59:21.053858 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 03 14:59:21 crc kubenswrapper[4774]: I1003 14:59:21.084101 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=3.011741717 podStartE2EDuration="25.084078944s" podCreationTimestamp="2025-10-03 14:58:56 +0000 UTC" firstStartedPulling="2025-10-03 14:58:57.59770855 +0000 UTC m=+960.186912002" lastFinishedPulling="2025-10-03 14:59:19.670045757 +0000 UTC m=+982.259249229" observedRunningTime="2025-10-03 14:59:21.080560126 +0000 UTC m=+983.669763588" watchObservedRunningTime="2025-10-03 14:59:21.084078944 +0000 UTC m=+983.673282396" Oct 03 14:59:21 crc kubenswrapper[4774]: I1003 14:59:21.164253 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-xfhvv"] Oct 03 14:59:21 crc kubenswrapper[4774]: I1003 14:59:21.171560 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-xfhvv"] Oct 03 14:59:21 crc kubenswrapper[4774]: I1003 14:59:21.187386 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-tx545"] Oct 03 14:59:21 crc kubenswrapper[4774]: I1003 14:59:21.194413 
4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-tx545"] Oct 03 14:59:21 crc kubenswrapper[4774]: I1003 14:59:21.239790 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-bhmcl"] Oct 03 14:59:21 crc kubenswrapper[4774]: I1003 14:59:21.308289 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3d2fe24-39c4-492e-a220-d46d06b1b587" path="/var/lib/kubelet/pods/f3d2fe24-39c4-492e-a220-d46d06b1b587/volumes" Oct 03 14:59:21 crc kubenswrapper[4774]: I1003 14:59:21.308666 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd4951b7-0996-49e5-9e11-b096eea903b5" path="/var/lib/kubelet/pods/fd4951b7-0996-49e5-9e11-b096eea903b5/volumes" Oct 03 14:59:21 crc kubenswrapper[4774]: W1003 14:59:21.465726 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03ee58bb_cd78_4fdb_986f_a9b60f9998e8.slice/crio-0d4ef64eed0db4bc833fe3954f389565059c46b7d78e67fc501309dbd8aeb861 WatchSource:0}: Error finding container 0d4ef64eed0db4bc833fe3954f389565059c46b7d78e67fc501309dbd8aeb861: Status 404 returned error can't find the container with id 0d4ef64eed0db4bc833fe3954f389565059c46b7d78e67fc501309dbd8aeb861 Oct 03 14:59:22 crc kubenswrapper[4774]: I1003 14:59:22.061903 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bhmcl" event={"ID":"03ee58bb-cd78-4fdb-986f-a9b60f9998e8","Type":"ContainerStarted","Data":"0d4ef64eed0db4bc833fe3954f389565059c46b7d78e67fc501309dbd8aeb861"} Oct 03 14:59:27 crc kubenswrapper[4774]: I1003 14:59:27.030611 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 03 14:59:28 crc kubenswrapper[4774]: I1003 14:59:28.676803 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-h8wlr"] Oct 03 14:59:28 crc kubenswrapper[4774]: I1003 
14:59:28.728778 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-jwmwm"] Oct 03 14:59:28 crc kubenswrapper[4774]: I1003 14:59:28.730713 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-jwmwm" Oct 03 14:59:28 crc kubenswrapper[4774]: I1003 14:59:28.742939 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-jwmwm"] Oct 03 14:59:28 crc kubenswrapper[4774]: I1003 14:59:28.847180 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/169fd512-6192-4597-a89b-cf9b0c6b76b1-config\") pod \"dnsmasq-dns-7cb5889db5-jwmwm\" (UID: \"169fd512-6192-4597-a89b-cf9b0c6b76b1\") " pod="openstack/dnsmasq-dns-7cb5889db5-jwmwm" Oct 03 14:59:28 crc kubenswrapper[4774]: I1003 14:59:28.847261 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rglrs\" (UniqueName: \"kubernetes.io/projected/169fd512-6192-4597-a89b-cf9b0c6b76b1-kube-api-access-rglrs\") pod \"dnsmasq-dns-7cb5889db5-jwmwm\" (UID: \"169fd512-6192-4597-a89b-cf9b0c6b76b1\") " pod="openstack/dnsmasq-dns-7cb5889db5-jwmwm" Oct 03 14:59:28 crc kubenswrapper[4774]: I1003 14:59:28.847299 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/169fd512-6192-4597-a89b-cf9b0c6b76b1-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-jwmwm\" (UID: \"169fd512-6192-4597-a89b-cf9b0c6b76b1\") " pod="openstack/dnsmasq-dns-7cb5889db5-jwmwm" Oct 03 14:59:28 crc kubenswrapper[4774]: I1003 14:59:28.950272 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/169fd512-6192-4597-a89b-cf9b0c6b76b1-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-jwmwm\" (UID: \"169fd512-6192-4597-a89b-cf9b0c6b76b1\") " 
pod="openstack/dnsmasq-dns-7cb5889db5-jwmwm" Oct 03 14:59:28 crc kubenswrapper[4774]: I1003 14:59:28.950402 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/169fd512-6192-4597-a89b-cf9b0c6b76b1-config\") pod \"dnsmasq-dns-7cb5889db5-jwmwm\" (UID: \"169fd512-6192-4597-a89b-cf9b0c6b76b1\") " pod="openstack/dnsmasq-dns-7cb5889db5-jwmwm" Oct 03 14:59:28 crc kubenswrapper[4774]: I1003 14:59:28.950647 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rglrs\" (UniqueName: \"kubernetes.io/projected/169fd512-6192-4597-a89b-cf9b0c6b76b1-kube-api-access-rglrs\") pod \"dnsmasq-dns-7cb5889db5-jwmwm\" (UID: \"169fd512-6192-4597-a89b-cf9b0c6b76b1\") " pod="openstack/dnsmasq-dns-7cb5889db5-jwmwm" Oct 03 14:59:28 crc kubenswrapper[4774]: I1003 14:59:28.951898 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/169fd512-6192-4597-a89b-cf9b0c6b76b1-config\") pod \"dnsmasq-dns-7cb5889db5-jwmwm\" (UID: \"169fd512-6192-4597-a89b-cf9b0c6b76b1\") " pod="openstack/dnsmasq-dns-7cb5889db5-jwmwm" Oct 03 14:59:28 crc kubenswrapper[4774]: I1003 14:59:28.951950 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/169fd512-6192-4597-a89b-cf9b0c6b76b1-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-jwmwm\" (UID: \"169fd512-6192-4597-a89b-cf9b0c6b76b1\") " pod="openstack/dnsmasq-dns-7cb5889db5-jwmwm" Oct 03 14:59:28 crc kubenswrapper[4774]: I1003 14:59:28.991221 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rglrs\" (UniqueName: \"kubernetes.io/projected/169fd512-6192-4597-a89b-cf9b0c6b76b1-kube-api-access-rglrs\") pod \"dnsmasq-dns-7cb5889db5-jwmwm\" (UID: \"169fd512-6192-4597-a89b-cf9b0c6b76b1\") " pod="openstack/dnsmasq-dns-7cb5889db5-jwmwm" Oct 03 14:59:29 crc kubenswrapper[4774]: I1003 
14:59:29.065046 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-jwmwm" Oct 03 14:59:29 crc kubenswrapper[4774]: I1003 14:59:29.236937 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-h8wlr" Oct 03 14:59:29 crc kubenswrapper[4774]: I1003 14:59:29.371658 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea92e131-6368-496f-b7ef-3affd118bb5a-dns-svc\") pod \"ea92e131-6368-496f-b7ef-3affd118bb5a\" (UID: \"ea92e131-6368-496f-b7ef-3affd118bb5a\") " Oct 03 14:59:29 crc kubenswrapper[4774]: I1003 14:59:29.372072 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgd8k\" (UniqueName: \"kubernetes.io/projected/ea92e131-6368-496f-b7ef-3affd118bb5a-kube-api-access-xgd8k\") pod \"ea92e131-6368-496f-b7ef-3affd118bb5a\" (UID: \"ea92e131-6368-496f-b7ef-3affd118bb5a\") " Oct 03 14:59:29 crc kubenswrapper[4774]: I1003 14:59:29.372134 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea92e131-6368-496f-b7ef-3affd118bb5a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ea92e131-6368-496f-b7ef-3affd118bb5a" (UID: "ea92e131-6368-496f-b7ef-3affd118bb5a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:59:29 crc kubenswrapper[4774]: I1003 14:59:29.372184 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea92e131-6368-496f-b7ef-3affd118bb5a-config\") pod \"ea92e131-6368-496f-b7ef-3affd118bb5a\" (UID: \"ea92e131-6368-496f-b7ef-3affd118bb5a\") " Oct 03 14:59:29 crc kubenswrapper[4774]: I1003 14:59:29.372620 4774 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea92e131-6368-496f-b7ef-3affd118bb5a-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 14:59:29 crc kubenswrapper[4774]: I1003 14:59:29.372664 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea92e131-6368-496f-b7ef-3affd118bb5a-config" (OuterVolumeSpecName: "config") pod "ea92e131-6368-496f-b7ef-3affd118bb5a" (UID: "ea92e131-6368-496f-b7ef-3affd118bb5a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:59:29 crc kubenswrapper[4774]: I1003 14:59:29.376440 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea92e131-6368-496f-b7ef-3affd118bb5a-kube-api-access-xgd8k" (OuterVolumeSpecName: "kube-api-access-xgd8k") pod "ea92e131-6368-496f-b7ef-3affd118bb5a" (UID: "ea92e131-6368-496f-b7ef-3affd118bb5a"). InnerVolumeSpecName "kube-api-access-xgd8k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:59:29 crc kubenswrapper[4774]: I1003 14:59:29.385384 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-jwmwm"] Oct 03 14:59:29 crc kubenswrapper[4774]: I1003 14:59:29.474324 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgd8k\" (UniqueName: \"kubernetes.io/projected/ea92e131-6368-496f-b7ef-3affd118bb5a-kube-api-access-xgd8k\") on node \"crc\" DevicePath \"\"" Oct 03 14:59:29 crc kubenswrapper[4774]: I1003 14:59:29.474349 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea92e131-6368-496f-b7ef-3affd118bb5a-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:59:29 crc kubenswrapper[4774]: I1003 14:59:29.879447 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 03 14:59:29 crc kubenswrapper[4774]: I1003 14:59:29.901258 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 03 14:59:29 crc kubenswrapper[4774]: I1003 14:59:29.906062 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 03 14:59:29 crc kubenswrapper[4774]: I1003 14:59:29.908545 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 03 14:59:29 crc kubenswrapper[4774]: I1003 14:59:29.908801 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-b6csw" Oct 03 14:59:29 crc kubenswrapper[4774]: I1003 14:59:29.909676 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 03 14:59:29 crc kubenswrapper[4774]: I1003 14:59:29.924405 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 03 14:59:30 crc kubenswrapper[4774]: I1003 14:59:30.084436 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"bc0b2c39-8c1e-4401-97f6-a4306b435436\") " pod="openstack/swift-storage-0" Oct 03 14:59:30 crc kubenswrapper[4774]: I1003 14:59:30.084490 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/bc0b2c39-8c1e-4401-97f6-a4306b435436-lock\") pod \"swift-storage-0\" (UID: \"bc0b2c39-8c1e-4401-97f6-a4306b435436\") " pod="openstack/swift-storage-0" Oct 03 14:59:30 crc kubenswrapper[4774]: I1003 14:59:30.084515 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/bc0b2c39-8c1e-4401-97f6-a4306b435436-cache\") pod \"swift-storage-0\" (UID: \"bc0b2c39-8c1e-4401-97f6-a4306b435436\") " pod="openstack/swift-storage-0" Oct 03 14:59:30 crc kubenswrapper[4774]: 
I1003 14:59:30.084569 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6tw8\" (UniqueName: \"kubernetes.io/projected/bc0b2c39-8c1e-4401-97f6-a4306b435436-kube-api-access-b6tw8\") pod \"swift-storage-0\" (UID: \"bc0b2c39-8c1e-4401-97f6-a4306b435436\") " pod="openstack/swift-storage-0" Oct 03 14:59:30 crc kubenswrapper[4774]: I1003 14:59:30.084676 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bc0b2c39-8c1e-4401-97f6-a4306b435436-etc-swift\") pod \"swift-storage-0\" (UID: \"bc0b2c39-8c1e-4401-97f6-a4306b435436\") " pod="openstack/swift-storage-0" Oct 03 14:59:30 crc kubenswrapper[4774]: I1003 14:59:30.136000 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-jwmwm" event={"ID":"169fd512-6192-4597-a89b-cf9b0c6b76b1","Type":"ContainerStarted","Data":"79e5d50e2d3608cc6fb8d213e5516485f1375afd983aa6042942730ac2b876df"} Oct 03 14:59:30 crc kubenswrapper[4774]: I1003 14:59:30.137618 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4wgb7" event={"ID":"b9111154-59d2-4b07-b8c3-db1870883cde","Type":"ContainerStarted","Data":"08aadb70ddf64aa6cd565539aceb93c2bc3701ec017ceac59f30a043b1d2d859"} Oct 03 14:59:30 crc kubenswrapper[4774]: I1003 14:59:30.138253 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-4wgb7" Oct 03 14:59:30 crc kubenswrapper[4774]: I1003 14:59:30.139850 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-h8wlr" event={"ID":"ea92e131-6368-496f-b7ef-3affd118bb5a","Type":"ContainerDied","Data":"a368f40bda172bcf0d79cf416ea332c24cfec7a4089cf2c810a08eafa1925d0d"} Oct 03 14:59:30 crc kubenswrapper[4774]: I1003 14:59:30.139913 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-h8wlr" Oct 03 14:59:30 crc kubenswrapper[4774]: I1003 14:59:30.141277 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66","Type":"ContainerStarted","Data":"902a77a8a7c06128ff297ae1a6633574dfd78ba3d568000a6ecc4e4476981610"} Oct 03 14:59:30 crc kubenswrapper[4774]: I1003 14:59:30.143143 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bhmcl" event={"ID":"03ee58bb-cd78-4fdb-986f-a9b60f9998e8","Type":"ContainerStarted","Data":"d2cdab458fc999b38512afd12eafb5014a5de9bd66dfb1825debf7b4774db2e5"} Oct 03 14:59:30 crc kubenswrapper[4774]: I1003 14:59:30.145009 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"dc5407ed-2e7c-4f5e-b24a-f040659e71f1","Type":"ContainerStarted","Data":"80936cb28db012f2bafbf7c939b2e0b63ff5591ff3ad497d90dc6ef19897378c"} Oct 03 14:59:30 crc kubenswrapper[4774]: I1003 14:59:30.145091 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 03 14:59:30 crc kubenswrapper[4774]: I1003 14:59:30.148854 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c880318a-5ff5-46f8-aca9-134c52ed3ad1","Type":"ContainerStarted","Data":"776b29e66f89006610427cec26d3f4e54cf8bbe71ffa30d074177afd589e26c8"} Oct 03 14:59:30 crc kubenswrapper[4774]: I1003 14:59:30.161700 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-4wgb7" podStartSLOduration=19.157519853 podStartE2EDuration="28.161681595s" podCreationTimestamp="2025-10-03 14:59:02 +0000 UTC" firstStartedPulling="2025-10-03 14:59:20.231611787 +0000 UTC m=+982.820815239" lastFinishedPulling="2025-10-03 14:59:29.235773529 +0000 UTC m=+991.824976981" observedRunningTime="2025-10-03 14:59:30.159239964 +0000 UTC m=+992.748443416" 
watchObservedRunningTime="2025-10-03 14:59:30.161681595 +0000 UTC m=+992.750885047" Oct 03 14:59:30 crc kubenswrapper[4774]: I1003 14:59:30.181763 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.297463296 podStartE2EDuration="32.181746615s" podCreationTimestamp="2025-10-03 14:58:58 +0000 UTC" firstStartedPulling="2025-10-03 14:58:59.269859797 +0000 UTC m=+961.859063249" lastFinishedPulling="2025-10-03 14:59:29.154143116 +0000 UTC m=+991.743346568" observedRunningTime="2025-10-03 14:59:30.173436818 +0000 UTC m=+992.762640270" watchObservedRunningTime="2025-10-03 14:59:30.181746615 +0000 UTC m=+992.770950067" Oct 03 14:59:30 crc kubenswrapper[4774]: I1003 14:59:30.186058 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"bc0b2c39-8c1e-4401-97f6-a4306b435436\") " pod="openstack/swift-storage-0" Oct 03 14:59:30 crc kubenswrapper[4774]: I1003 14:59:30.186099 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/bc0b2c39-8c1e-4401-97f6-a4306b435436-lock\") pod \"swift-storage-0\" (UID: \"bc0b2c39-8c1e-4401-97f6-a4306b435436\") " pod="openstack/swift-storage-0" Oct 03 14:59:30 crc kubenswrapper[4774]: I1003 14:59:30.186117 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/bc0b2c39-8c1e-4401-97f6-a4306b435436-cache\") pod \"swift-storage-0\" (UID: \"bc0b2c39-8c1e-4401-97f6-a4306b435436\") " pod="openstack/swift-storage-0" Oct 03 14:59:30 crc kubenswrapper[4774]: I1003 14:59:30.186140 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6tw8\" (UniqueName: \"kubernetes.io/projected/bc0b2c39-8c1e-4401-97f6-a4306b435436-kube-api-access-b6tw8\") 
pod \"swift-storage-0\" (UID: \"bc0b2c39-8c1e-4401-97f6-a4306b435436\") " pod="openstack/swift-storage-0" Oct 03 14:59:30 crc kubenswrapper[4774]: I1003 14:59:30.186171 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bc0b2c39-8c1e-4401-97f6-a4306b435436-etc-swift\") pod \"swift-storage-0\" (UID: \"bc0b2c39-8c1e-4401-97f6-a4306b435436\") " pod="openstack/swift-storage-0" Oct 03 14:59:30 crc kubenswrapper[4774]: E1003 14:59:30.186293 4774 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 14:59:30 crc kubenswrapper[4774]: E1003 14:59:30.186311 4774 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 14:59:30 crc kubenswrapper[4774]: E1003 14:59:30.186353 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc0b2c39-8c1e-4401-97f6-a4306b435436-etc-swift podName:bc0b2c39-8c1e-4401-97f6-a4306b435436 nodeName:}" failed. No retries permitted until 2025-10-03 14:59:30.686337669 +0000 UTC m=+993.275541121 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bc0b2c39-8c1e-4401-97f6-a4306b435436-etc-swift") pod "swift-storage-0" (UID: "bc0b2c39-8c1e-4401-97f6-a4306b435436") : configmap "swift-ring-files" not found Oct 03 14:59:30 crc kubenswrapper[4774]: I1003 14:59:30.186861 4774 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"bc0b2c39-8c1e-4401-97f6-a4306b435436\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/swift-storage-0" Oct 03 14:59:30 crc kubenswrapper[4774]: I1003 14:59:30.192780 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/bc0b2c39-8c1e-4401-97f6-a4306b435436-cache\") pod \"swift-storage-0\" (UID: \"bc0b2c39-8c1e-4401-97f6-a4306b435436\") " pod="openstack/swift-storage-0" Oct 03 14:59:30 crc kubenswrapper[4774]: I1003 14:59:30.192830 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/bc0b2c39-8c1e-4401-97f6-a4306b435436-lock\") pod \"swift-storage-0\" (UID: \"bc0b2c39-8c1e-4401-97f6-a4306b435436\") " pod="openstack/swift-storage-0" Oct 03 14:59:30 crc kubenswrapper[4774]: I1003 14:59:30.211398 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6tw8\" (UniqueName: \"kubernetes.io/projected/bc0b2c39-8c1e-4401-97f6-a4306b435436-kube-api-access-b6tw8\") pod \"swift-storage-0\" (UID: \"bc0b2c39-8c1e-4401-97f6-a4306b435436\") " pod="openstack/swift-storage-0" Oct 03 14:59:30 crc kubenswrapper[4774]: I1003 14:59:30.268670 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"bc0b2c39-8c1e-4401-97f6-a4306b435436\") " 
pod="openstack/swift-storage-0" Oct 03 14:59:30 crc kubenswrapper[4774]: I1003 14:59:30.444463 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-h8wlr"] Oct 03 14:59:30 crc kubenswrapper[4774]: I1003 14:59:30.449486 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-h8wlr"] Oct 03 14:59:30 crc kubenswrapper[4774]: I1003 14:59:30.697325 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bc0b2c39-8c1e-4401-97f6-a4306b435436-etc-swift\") pod \"swift-storage-0\" (UID: \"bc0b2c39-8c1e-4401-97f6-a4306b435436\") " pod="openstack/swift-storage-0" Oct 03 14:59:30 crc kubenswrapper[4774]: E1003 14:59:30.697739 4774 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 14:59:30 crc kubenswrapper[4774]: E1003 14:59:30.697755 4774 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 14:59:30 crc kubenswrapper[4774]: E1003 14:59:30.697795 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc0b2c39-8c1e-4401-97f6-a4306b435436-etc-swift podName:bc0b2c39-8c1e-4401-97f6-a4306b435436 nodeName:}" failed. No retries permitted until 2025-10-03 14:59:31.69778123 +0000 UTC m=+994.286984682 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bc0b2c39-8c1e-4401-97f6-a4306b435436-etc-swift") pod "swift-storage-0" (UID: "bc0b2c39-8c1e-4401-97f6-a4306b435436") : configmap "swift-ring-files" not found Oct 03 14:59:31 crc kubenswrapper[4774]: I1003 14:59:31.163574 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7fa97e79-a30c-4722-b02b-ec5494bd057c","Type":"ContainerStarted","Data":"ba4f05d232c8d413350b06f66392f7b2e2403d6d44ba76999902b2b487809df3"} Oct 03 14:59:31 crc kubenswrapper[4774]: I1003 14:59:31.165438 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0a0a516a-bd97-4484-802b-71eb14f3ca3f","Type":"ContainerStarted","Data":"bf3375d50c95a1b4c96eaea4067faf1aa3e5e031209fd4b5cd965560980c24ae"} Oct 03 14:59:31 crc kubenswrapper[4774]: I1003 14:59:31.169766 4774 generic.go:334] "Generic (PLEG): container finished" podID="169fd512-6192-4597-a89b-cf9b0c6b76b1" containerID="434c2a1209f9be7e0d01cfc00c482daf1bf212471541c2a64d486a17aff19d55" exitCode=0 Oct 03 14:59:31 crc kubenswrapper[4774]: I1003 14:59:31.169884 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-jwmwm" event={"ID":"169fd512-6192-4597-a89b-cf9b0c6b76b1","Type":"ContainerDied","Data":"434c2a1209f9be7e0d01cfc00c482daf1bf212471541c2a64d486a17aff19d55"} Oct 03 14:59:31 crc kubenswrapper[4774]: I1003 14:59:31.175055 4774 generic.go:334] "Generic (PLEG): container finished" podID="03ee58bb-cd78-4fdb-986f-a9b60f9998e8" containerID="d2cdab458fc999b38512afd12eafb5014a5de9bd66dfb1825debf7b4774db2e5" exitCode=0 Oct 03 14:59:31 crc kubenswrapper[4774]: I1003 14:59:31.176755 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bhmcl" event={"ID":"03ee58bb-cd78-4fdb-986f-a9b60f9998e8","Type":"ContainerDied","Data":"d2cdab458fc999b38512afd12eafb5014a5de9bd66dfb1825debf7b4774db2e5"} 
Oct 03 14:59:31 crc kubenswrapper[4774]: I1003 14:59:31.313358 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea92e131-6368-496f-b7ef-3affd118bb5a" path="/var/lib/kubelet/pods/ea92e131-6368-496f-b7ef-3affd118bb5a/volumes" Oct 03 14:59:31 crc kubenswrapper[4774]: I1003 14:59:31.715123 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bc0b2c39-8c1e-4401-97f6-a4306b435436-etc-swift\") pod \"swift-storage-0\" (UID: \"bc0b2c39-8c1e-4401-97f6-a4306b435436\") " pod="openstack/swift-storage-0" Oct 03 14:59:31 crc kubenswrapper[4774]: E1003 14:59:31.715318 4774 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 14:59:31 crc kubenswrapper[4774]: E1003 14:59:31.715502 4774 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 14:59:31 crc kubenswrapper[4774]: E1003 14:59:31.715575 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc0b2c39-8c1e-4401-97f6-a4306b435436-etc-swift podName:bc0b2c39-8c1e-4401-97f6-a4306b435436 nodeName:}" failed. No retries permitted until 2025-10-03 14:59:33.715552335 +0000 UTC m=+996.304755787 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bc0b2c39-8c1e-4401-97f6-a4306b435436-etc-swift") pod "swift-storage-0" (UID: "bc0b2c39-8c1e-4401-97f6-a4306b435436") : configmap "swift-ring-files" not found Oct 03 14:59:32 crc kubenswrapper[4774]: I1003 14:59:32.195755 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bhmcl" event={"ID":"03ee58bb-cd78-4fdb-986f-a9b60f9998e8","Type":"ContainerStarted","Data":"b1a77ab8ef4588d5254e7e16527591a52637d82a45557fa1e13dd8e91e287f24"} Oct 03 14:59:32 crc kubenswrapper[4774]: I1003 14:59:32.198758 4774 generic.go:334] "Generic (PLEG): container finished" podID="a5455d9b-4489-4041-b44d-990124dd84e4" containerID="47df8cec5731e464746469c0881c5c7cc79c0e72e43bcbce84b93172eeea4dbf" exitCode=0 Oct 03 14:59:32 crc kubenswrapper[4774]: I1003 14:59:32.198822 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a5455d9b-4489-4041-b44d-990124dd84e4","Type":"ContainerDied","Data":"47df8cec5731e464746469c0881c5c7cc79c0e72e43bcbce84b93172eeea4dbf"} Oct 03 14:59:32 crc kubenswrapper[4774]: I1003 14:59:32.201300 4774 generic.go:334] "Generic (PLEG): container finished" podID="27254696-8788-47fe-a9a4-208fd295e427" containerID="924121d1ba26c7434dd8cdae19f636eb00c3035d5cb792352eb70234f56fc364" exitCode=0 Oct 03 14:59:32 crc kubenswrapper[4774]: I1003 14:59:32.201365 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"27254696-8788-47fe-a9a4-208fd295e427","Type":"ContainerDied","Data":"924121d1ba26c7434dd8cdae19f636eb00c3035d5cb792352eb70234f56fc364"} Oct 03 14:59:32 crc kubenswrapper[4774]: I1003 14:59:32.203273 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-jwmwm" event={"ID":"169fd512-6192-4597-a89b-cf9b0c6b76b1","Type":"ContainerStarted","Data":"98189ad2c12124c7a82be7808867d5919631c27e2d5943c5e2d1f6d573c52d7f"} 
Oct 03 14:59:32 crc kubenswrapper[4774]: I1003 14:59:32.203579 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7cb5889db5-jwmwm" Oct 03 14:59:32 crc kubenswrapper[4774]: I1003 14:59:32.268307 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7cb5889db5-jwmwm" podStartSLOduration=3.556089412 podStartE2EDuration="4.268283925s" podCreationTimestamp="2025-10-03 14:59:28 +0000 UTC" firstStartedPulling="2025-10-03 14:59:29.409138388 +0000 UTC m=+991.998341840" lastFinishedPulling="2025-10-03 14:59:30.121332901 +0000 UTC m=+992.710536353" observedRunningTime="2025-10-03 14:59:32.263741672 +0000 UTC m=+994.852945114" watchObservedRunningTime="2025-10-03 14:59:32.268283925 +0000 UTC m=+994.857487377" Oct 03 14:59:33 crc kubenswrapper[4774]: E1003 14:59:33.567582 4774 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Oct 03 14:59:33 crc kubenswrapper[4774]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/b91a9c71-b83c-4e14-957f-820cba700771/volume-subpaths/dns-svc/init/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 03 14:59:33 crc kubenswrapper[4774]: > podSandboxID="d58d6367d94351bb56b0ff2e431904b9b441d7c7c4a46b9226a08a7d638883ae" Oct 03 14:59:33 crc kubenswrapper[4774]: E1003 14:59:33.568240 4774 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 03 14:59:33 crc kubenswrapper[4774]: init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r5446,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-cmtxq_openstack(b91a9c71-b83c-4e14-957f-820cba700771): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/b91a9c71-b83c-4e14-957f-820cba700771/volume-subpaths/dns-svc/init/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 
03 14:59:33 crc kubenswrapper[4774]: > logger="UnhandledError" Oct 03 14:59:33 crc kubenswrapper[4774]: E1003 14:59:33.569439 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/b91a9c71-b83c-4e14-957f-820cba700771/volume-subpaths/dns-svc/init/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-5ccc8479f9-cmtxq" podUID="b91a9c71-b83c-4e14-957f-820cba700771" Oct 03 14:59:33 crc kubenswrapper[4774]: I1003 14:59:33.748782 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bc0b2c39-8c1e-4401-97f6-a4306b435436-etc-swift\") pod \"swift-storage-0\" (UID: \"bc0b2c39-8c1e-4401-97f6-a4306b435436\") " pod="openstack/swift-storage-0" Oct 03 14:59:33 crc kubenswrapper[4774]: E1003 14:59:33.749008 4774 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 14:59:33 crc kubenswrapper[4774]: E1003 14:59:33.749247 4774 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 14:59:33 crc kubenswrapper[4774]: E1003 14:59:33.749314 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc0b2c39-8c1e-4401-97f6-a4306b435436-etc-swift podName:bc0b2c39-8c1e-4401-97f6-a4306b435436 nodeName:}" failed. No retries permitted until 2025-10-03 14:59:37.74929517 +0000 UTC m=+1000.338498622 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bc0b2c39-8c1e-4401-97f6-a4306b435436-etc-swift") pod "swift-storage-0" (UID: "bc0b2c39-8c1e-4401-97f6-a4306b435436") : configmap "swift-ring-files" not found Oct 03 14:59:33 crc kubenswrapper[4774]: I1003 14:59:33.814804 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-xxj5b"] Oct 03 14:59:33 crc kubenswrapper[4774]: I1003 14:59:33.816200 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-xxj5b" Oct 03 14:59:33 crc kubenswrapper[4774]: I1003 14:59:33.818389 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 03 14:59:33 crc kubenswrapper[4774]: I1003 14:59:33.818750 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 03 14:59:33 crc kubenswrapper[4774]: I1003 14:59:33.819455 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 03 14:59:33 crc kubenswrapper[4774]: I1003 14:59:33.823930 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-xxj5b"] Oct 03 14:59:33 crc kubenswrapper[4774]: I1003 14:59:33.952255 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5244bd24-f205-4576-a7cd-6da859f28e21-combined-ca-bundle\") pod \"swift-ring-rebalance-xxj5b\" (UID: \"5244bd24-f205-4576-a7cd-6da859f28e21\") " pod="openstack/swift-ring-rebalance-xxj5b" Oct 03 14:59:33 crc kubenswrapper[4774]: I1003 14:59:33.952355 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5244bd24-f205-4576-a7cd-6da859f28e21-etc-swift\") pod \"swift-ring-rebalance-xxj5b\" (UID: 
\"5244bd24-f205-4576-a7cd-6da859f28e21\") " pod="openstack/swift-ring-rebalance-xxj5b" Oct 03 14:59:33 crc kubenswrapper[4774]: I1003 14:59:33.952516 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5244bd24-f205-4576-a7cd-6da859f28e21-scripts\") pod \"swift-ring-rebalance-xxj5b\" (UID: \"5244bd24-f205-4576-a7cd-6da859f28e21\") " pod="openstack/swift-ring-rebalance-xxj5b" Oct 03 14:59:33 crc kubenswrapper[4774]: I1003 14:59:33.952578 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5244bd24-f205-4576-a7cd-6da859f28e21-ring-data-devices\") pod \"swift-ring-rebalance-xxj5b\" (UID: \"5244bd24-f205-4576-a7cd-6da859f28e21\") " pod="openstack/swift-ring-rebalance-xxj5b" Oct 03 14:59:33 crc kubenswrapper[4774]: I1003 14:59:33.952721 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5244bd24-f205-4576-a7cd-6da859f28e21-swiftconf\") pod \"swift-ring-rebalance-xxj5b\" (UID: \"5244bd24-f205-4576-a7cd-6da859f28e21\") " pod="openstack/swift-ring-rebalance-xxj5b" Oct 03 14:59:33 crc kubenswrapper[4774]: I1003 14:59:33.952844 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k624m\" (UniqueName: \"kubernetes.io/projected/5244bd24-f205-4576-a7cd-6da859f28e21-kube-api-access-k624m\") pod \"swift-ring-rebalance-xxj5b\" (UID: \"5244bd24-f205-4576-a7cd-6da859f28e21\") " pod="openstack/swift-ring-rebalance-xxj5b" Oct 03 14:59:33 crc kubenswrapper[4774]: I1003 14:59:33.952928 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5244bd24-f205-4576-a7cd-6da859f28e21-dispersionconf\") pod 
\"swift-ring-rebalance-xxj5b\" (UID: \"5244bd24-f205-4576-a7cd-6da859f28e21\") " pod="openstack/swift-ring-rebalance-xxj5b" Oct 03 14:59:34 crc kubenswrapper[4774]: I1003 14:59:34.054473 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5244bd24-f205-4576-a7cd-6da859f28e21-scripts\") pod \"swift-ring-rebalance-xxj5b\" (UID: \"5244bd24-f205-4576-a7cd-6da859f28e21\") " pod="openstack/swift-ring-rebalance-xxj5b" Oct 03 14:59:34 crc kubenswrapper[4774]: I1003 14:59:34.054560 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5244bd24-f205-4576-a7cd-6da859f28e21-ring-data-devices\") pod \"swift-ring-rebalance-xxj5b\" (UID: \"5244bd24-f205-4576-a7cd-6da859f28e21\") " pod="openstack/swift-ring-rebalance-xxj5b" Oct 03 14:59:34 crc kubenswrapper[4774]: I1003 14:59:34.054666 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5244bd24-f205-4576-a7cd-6da859f28e21-swiftconf\") pod \"swift-ring-rebalance-xxj5b\" (UID: \"5244bd24-f205-4576-a7cd-6da859f28e21\") " pod="openstack/swift-ring-rebalance-xxj5b" Oct 03 14:59:34 crc kubenswrapper[4774]: I1003 14:59:34.054787 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k624m\" (UniqueName: \"kubernetes.io/projected/5244bd24-f205-4576-a7cd-6da859f28e21-kube-api-access-k624m\") pod \"swift-ring-rebalance-xxj5b\" (UID: \"5244bd24-f205-4576-a7cd-6da859f28e21\") " pod="openstack/swift-ring-rebalance-xxj5b" Oct 03 14:59:34 crc kubenswrapper[4774]: I1003 14:59:34.054856 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5244bd24-f205-4576-a7cd-6da859f28e21-dispersionconf\") pod \"swift-ring-rebalance-xxj5b\" (UID: \"5244bd24-f205-4576-a7cd-6da859f28e21\") " 
pod="openstack/swift-ring-rebalance-xxj5b" Oct 03 14:59:34 crc kubenswrapper[4774]: I1003 14:59:34.054909 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5244bd24-f205-4576-a7cd-6da859f28e21-combined-ca-bundle\") pod \"swift-ring-rebalance-xxj5b\" (UID: \"5244bd24-f205-4576-a7cd-6da859f28e21\") " pod="openstack/swift-ring-rebalance-xxj5b" Oct 03 14:59:34 crc kubenswrapper[4774]: I1003 14:59:34.054956 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5244bd24-f205-4576-a7cd-6da859f28e21-etc-swift\") pod \"swift-ring-rebalance-xxj5b\" (UID: \"5244bd24-f205-4576-a7cd-6da859f28e21\") " pod="openstack/swift-ring-rebalance-xxj5b" Oct 03 14:59:34 crc kubenswrapper[4774]: I1003 14:59:34.055612 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5244bd24-f205-4576-a7cd-6da859f28e21-etc-swift\") pod \"swift-ring-rebalance-xxj5b\" (UID: \"5244bd24-f205-4576-a7cd-6da859f28e21\") " pod="openstack/swift-ring-rebalance-xxj5b" Oct 03 14:59:34 crc kubenswrapper[4774]: I1003 14:59:34.055603 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5244bd24-f205-4576-a7cd-6da859f28e21-ring-data-devices\") pod \"swift-ring-rebalance-xxj5b\" (UID: \"5244bd24-f205-4576-a7cd-6da859f28e21\") " pod="openstack/swift-ring-rebalance-xxj5b" Oct 03 14:59:34 crc kubenswrapper[4774]: I1003 14:59:34.055725 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5244bd24-f205-4576-a7cd-6da859f28e21-scripts\") pod \"swift-ring-rebalance-xxj5b\" (UID: \"5244bd24-f205-4576-a7cd-6da859f28e21\") " pod="openstack/swift-ring-rebalance-xxj5b" Oct 03 14:59:34 crc kubenswrapper[4774]: I1003 14:59:34.061578 4774 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5244bd24-f205-4576-a7cd-6da859f28e21-combined-ca-bundle\") pod \"swift-ring-rebalance-xxj5b\" (UID: \"5244bd24-f205-4576-a7cd-6da859f28e21\") " pod="openstack/swift-ring-rebalance-xxj5b" Oct 03 14:59:34 crc kubenswrapper[4774]: I1003 14:59:34.061575 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5244bd24-f205-4576-a7cd-6da859f28e21-swiftconf\") pod \"swift-ring-rebalance-xxj5b\" (UID: \"5244bd24-f205-4576-a7cd-6da859f28e21\") " pod="openstack/swift-ring-rebalance-xxj5b" Oct 03 14:59:34 crc kubenswrapper[4774]: I1003 14:59:34.061958 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5244bd24-f205-4576-a7cd-6da859f28e21-dispersionconf\") pod \"swift-ring-rebalance-xxj5b\" (UID: \"5244bd24-f205-4576-a7cd-6da859f28e21\") " pod="openstack/swift-ring-rebalance-xxj5b" Oct 03 14:59:34 crc kubenswrapper[4774]: I1003 14:59:34.070923 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k624m\" (UniqueName: \"kubernetes.io/projected/5244bd24-f205-4576-a7cd-6da859f28e21-kube-api-access-k624m\") pod \"swift-ring-rebalance-xxj5b\" (UID: \"5244bd24-f205-4576-a7cd-6da859f28e21\") " pod="openstack/swift-ring-rebalance-xxj5b" Oct 03 14:59:34 crc kubenswrapper[4774]: I1003 14:59:34.143236 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-xxj5b" Oct 03 14:59:34 crc kubenswrapper[4774]: I1003 14:59:34.227357 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"27254696-8788-47fe-a9a4-208fd295e427","Type":"ContainerStarted","Data":"cc4cb2439ce16ba38a50484689cae4481cbddc18c252b85edd0263881c345508"} Oct 03 14:59:34 crc kubenswrapper[4774]: I1003 14:59:34.230789 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66","Type":"ContainerStarted","Data":"ecd34f74e1e00a8d9ce2bf54e5c6ff5a77e65a9b058b47792b6aca05b1148d60"} Oct 03 14:59:34 crc kubenswrapper[4774]: I1003 14:59:34.233310 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bhmcl" event={"ID":"03ee58bb-cd78-4fdb-986f-a9b60f9998e8","Type":"ContainerStarted","Data":"ae16c238e7d9cf2a875a4ae650853f114412b2b155724b924426369a4b7d243a"} Oct 03 14:59:34 crc kubenswrapper[4774]: I1003 14:59:34.233477 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-bhmcl" Oct 03 14:59:34 crc kubenswrapper[4774]: I1003 14:59:34.235529 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a5455d9b-4489-4041-b44d-990124dd84e4","Type":"ContainerStarted","Data":"bf8c79c62f59882f9fadd6563f13b63cade45f5a9e9c2cc529b77d9274c8da27"} Oct 03 14:59:34 crc kubenswrapper[4774]: I1003 14:59:34.239018 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c880318a-5ff5-46f8-aca9-134c52ed3ad1","Type":"ContainerStarted","Data":"164ca8f934aaeba0ec16a37e6e627b1cad10682c68ae774f30e6ab9d81a982d7"} Oct 03 14:59:34 crc kubenswrapper[4774]: I1003 14:59:34.255301 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=16.765825414 
podStartE2EDuration="40.255279705s" podCreationTimestamp="2025-10-03 14:58:54 +0000 UTC" firstStartedPulling="2025-10-03 14:58:56.180420652 +0000 UTC m=+958.769624104" lastFinishedPulling="2025-10-03 14:59:19.669874943 +0000 UTC m=+982.259078395" observedRunningTime="2025-10-03 14:59:34.24784328 +0000 UTC m=+996.837046732" watchObservedRunningTime="2025-10-03 14:59:34.255279705 +0000 UTC m=+996.844483167" Oct 03 14:59:34 crc kubenswrapper[4774]: I1003 14:59:34.278531 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=17.639461539 podStartE2EDuration="30.278511894s" podCreationTimestamp="2025-10-03 14:59:04 +0000 UTC" firstStartedPulling="2025-10-03 14:59:20.584207811 +0000 UTC m=+983.173411263" lastFinishedPulling="2025-10-03 14:59:33.223258166 +0000 UTC m=+995.812461618" observedRunningTime="2025-10-03 14:59:34.273647843 +0000 UTC m=+996.862851295" watchObservedRunningTime="2025-10-03 14:59:34.278511894 +0000 UTC m=+996.867715366" Oct 03 14:59:34 crc kubenswrapper[4774]: I1003 14:59:34.300323 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=19.461929446 podStartE2EDuration="32.300310027s" podCreationTimestamp="2025-10-03 14:59:02 +0000 UTC" firstStartedPulling="2025-10-03 14:59:20.389530051 +0000 UTC m=+982.978733503" lastFinishedPulling="2025-10-03 14:59:33.227910622 +0000 UTC m=+995.817114084" observedRunningTime="2025-10-03 14:59:34.297521727 +0000 UTC m=+996.886725179" watchObservedRunningTime="2025-10-03 14:59:34.300310027 +0000 UTC m=+996.889513479" Oct 03 14:59:34 crc kubenswrapper[4774]: I1003 14:59:34.347808 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-bhmcl" podStartSLOduration=24.644998208 podStartE2EDuration="32.345136694s" podCreationTimestamp="2025-10-03 14:59:02 +0000 UTC" firstStartedPulling="2025-10-03 14:59:21.469029594 +0000 UTC 
m=+984.058233046" lastFinishedPulling="2025-10-03 14:59:29.16916808 +0000 UTC m=+991.758371532" observedRunningTime="2025-10-03 14:59:34.315994318 +0000 UTC m=+996.905197770" watchObservedRunningTime="2025-10-03 14:59:34.345136694 +0000 UTC m=+996.934340176" Oct 03 14:59:34 crc kubenswrapper[4774]: I1003 14:59:34.360259 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=18.024538674 podStartE2EDuration="39.36024403s" podCreationTimestamp="2025-10-03 14:58:55 +0000 UTC" firstStartedPulling="2025-10-03 14:58:57.345845786 +0000 UTC m=+959.935049238" lastFinishedPulling="2025-10-03 14:59:18.681551132 +0000 UTC m=+981.270754594" observedRunningTime="2025-10-03 14:59:34.339925894 +0000 UTC m=+996.929129356" watchObservedRunningTime="2025-10-03 14:59:34.36024403 +0000 UTC m=+996.949447482" Oct 03 14:59:34 crc kubenswrapper[4774]: I1003 14:59:34.662689 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-xxj5b"] Oct 03 14:59:34 crc kubenswrapper[4774]: W1003 14:59:34.669351 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5244bd24_f205_4576_a7cd_6da859f28e21.slice/crio-841d825bb85699b2d63c1f7810438c161344edf5a91a1b3f0b0b7b07ab84bf07 WatchSource:0}: Error finding container 841d825bb85699b2d63c1f7810438c161344edf5a91a1b3f0b0b7b07ab84bf07: Status 404 returned error can't find the container with id 841d825bb85699b2d63c1f7810438c161344edf5a91a1b3f0b0b7b07ab84bf07 Oct 03 14:59:35 crc kubenswrapper[4774]: I1003 14:59:35.253720 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-xxj5b" event={"ID":"5244bd24-f205-4576-a7cd-6da859f28e21","Type":"ContainerStarted","Data":"841d825bb85699b2d63c1f7810438c161344edf5a91a1b3f0b0b7b07ab84bf07"} Oct 03 14:59:35 crc kubenswrapper[4774]: I1003 14:59:35.254152 4774 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ovn-controller-ovs-bhmcl" Oct 03 14:59:35 crc kubenswrapper[4774]: I1003 14:59:35.461917 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 03 14:59:35 crc kubenswrapper[4774]: I1003 14:59:35.462024 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 03 14:59:35 crc kubenswrapper[4774]: I1003 14:59:35.902920 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 03 14:59:35 crc kubenswrapper[4774]: I1003 14:59:35.903087 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 03 14:59:35 crc kubenswrapper[4774]: I1003 14:59:35.953561 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 03 14:59:36 crc kubenswrapper[4774]: I1003 14:59:36.303824 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 03 14:59:36 crc kubenswrapper[4774]: I1003 14:59:36.644693 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-cmtxq"] Oct 03 14:59:36 crc kubenswrapper[4774]: I1003 14:59:36.675117 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c89d5d749-kzl8b"] Oct 03 14:59:36 crc kubenswrapper[4774]: I1003 14:59:36.677150 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c89d5d749-kzl8b" Oct 03 14:59:36 crc kubenswrapper[4774]: I1003 14:59:36.681119 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 03 14:59:36 crc kubenswrapper[4774]: I1003 14:59:36.695180 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c89d5d749-kzl8b"] Oct 03 14:59:36 crc kubenswrapper[4774]: I1003 14:59:36.702715 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 03 14:59:36 crc kubenswrapper[4774]: I1003 14:59:36.703057 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 03 14:59:36 crc kubenswrapper[4774]: I1003 14:59:36.703957 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57a4a416-3b4b-48cb-9f31-f7fa373f25df-dns-svc\") pod \"dnsmasq-dns-6c89d5d749-kzl8b\" (UID: \"57a4a416-3b4b-48cb-9f31-f7fa373f25df\") " pod="openstack/dnsmasq-dns-6c89d5d749-kzl8b" Oct 03 14:59:36 crc kubenswrapper[4774]: I1003 14:59:36.704193 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x96c7\" (UniqueName: \"kubernetes.io/projected/57a4a416-3b4b-48cb-9f31-f7fa373f25df-kube-api-access-x96c7\") pod \"dnsmasq-dns-6c89d5d749-kzl8b\" (UID: \"57a4a416-3b4b-48cb-9f31-f7fa373f25df\") " pod="openstack/dnsmasq-dns-6c89d5d749-kzl8b" Oct 03 14:59:36 crc kubenswrapper[4774]: I1003 14:59:36.704239 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57a4a416-3b4b-48cb-9f31-f7fa373f25df-config\") pod \"dnsmasq-dns-6c89d5d749-kzl8b\" (UID: \"57a4a416-3b4b-48cb-9f31-f7fa373f25df\") " pod="openstack/dnsmasq-dns-6c89d5d749-kzl8b" Oct 03 14:59:36 crc kubenswrapper[4774]: I1003 
14:59:36.704314 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57a4a416-3b4b-48cb-9f31-f7fa373f25df-ovsdbserver-sb\") pod \"dnsmasq-dns-6c89d5d749-kzl8b\" (UID: \"57a4a416-3b4b-48cb-9f31-f7fa373f25df\") " pod="openstack/dnsmasq-dns-6c89d5d749-kzl8b" Oct 03 14:59:36 crc kubenswrapper[4774]: I1003 14:59:36.806385 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x96c7\" (UniqueName: \"kubernetes.io/projected/57a4a416-3b4b-48cb-9f31-f7fa373f25df-kube-api-access-x96c7\") pod \"dnsmasq-dns-6c89d5d749-kzl8b\" (UID: \"57a4a416-3b4b-48cb-9f31-f7fa373f25df\") " pod="openstack/dnsmasq-dns-6c89d5d749-kzl8b" Oct 03 14:59:36 crc kubenswrapper[4774]: I1003 14:59:36.806482 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57a4a416-3b4b-48cb-9f31-f7fa373f25df-config\") pod \"dnsmasq-dns-6c89d5d749-kzl8b\" (UID: \"57a4a416-3b4b-48cb-9f31-f7fa373f25df\") " pod="openstack/dnsmasq-dns-6c89d5d749-kzl8b" Oct 03 14:59:36 crc kubenswrapper[4774]: I1003 14:59:36.806576 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57a4a416-3b4b-48cb-9f31-f7fa373f25df-ovsdbserver-sb\") pod \"dnsmasq-dns-6c89d5d749-kzl8b\" (UID: \"57a4a416-3b4b-48cb-9f31-f7fa373f25df\") " pod="openstack/dnsmasq-dns-6c89d5d749-kzl8b" Oct 03 14:59:36 crc kubenswrapper[4774]: I1003 14:59:36.806679 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57a4a416-3b4b-48cb-9f31-f7fa373f25df-dns-svc\") pod \"dnsmasq-dns-6c89d5d749-kzl8b\" (UID: \"57a4a416-3b4b-48cb-9f31-f7fa373f25df\") " pod="openstack/dnsmasq-dns-6c89d5d749-kzl8b" Oct 03 14:59:36 crc kubenswrapper[4774]: I1003 14:59:36.807451 4774 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57a4a416-3b4b-48cb-9f31-f7fa373f25df-config\") pod \"dnsmasq-dns-6c89d5d749-kzl8b\" (UID: \"57a4a416-3b4b-48cb-9f31-f7fa373f25df\") " pod="openstack/dnsmasq-dns-6c89d5d749-kzl8b" Oct 03 14:59:36 crc kubenswrapper[4774]: I1003 14:59:36.807977 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57a4a416-3b4b-48cb-9f31-f7fa373f25df-dns-svc\") pod \"dnsmasq-dns-6c89d5d749-kzl8b\" (UID: \"57a4a416-3b4b-48cb-9f31-f7fa373f25df\") " pod="openstack/dnsmasq-dns-6c89d5d749-kzl8b" Oct 03 14:59:36 crc kubenswrapper[4774]: I1003 14:59:36.808039 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57a4a416-3b4b-48cb-9f31-f7fa373f25df-ovsdbserver-sb\") pod \"dnsmasq-dns-6c89d5d749-kzl8b\" (UID: \"57a4a416-3b4b-48cb-9f31-f7fa373f25df\") " pod="openstack/dnsmasq-dns-6c89d5d749-kzl8b" Oct 03 14:59:36 crc kubenswrapper[4774]: I1003 14:59:36.855355 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x96c7\" (UniqueName: \"kubernetes.io/projected/57a4a416-3b4b-48cb-9f31-f7fa373f25df-kube-api-access-x96c7\") pod \"dnsmasq-dns-6c89d5d749-kzl8b\" (UID: \"57a4a416-3b4b-48cb-9f31-f7fa373f25df\") " pod="openstack/dnsmasq-dns-6c89d5d749-kzl8b" Oct 03 14:59:36 crc kubenswrapper[4774]: I1003 14:59:36.886139 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-kh252"] Oct 03 14:59:36 crc kubenswrapper[4774]: I1003 14:59:36.887103 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-kh252" Oct 03 14:59:36 crc kubenswrapper[4774]: I1003 14:59:36.889285 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 03 14:59:36 crc kubenswrapper[4774]: I1003 14:59:36.915602 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-kh252"] Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.016503 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b62dc8be-5127-4f62-bdf9-f1db2425c2c1-config\") pod \"ovn-controller-metrics-kh252\" (UID: \"b62dc8be-5127-4f62-bdf9-f1db2425c2c1\") " pod="openstack/ovn-controller-metrics-kh252" Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.016594 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/b62dc8be-5127-4f62-bdf9-f1db2425c2c1-ovs-rundir\") pod \"ovn-controller-metrics-kh252\" (UID: \"b62dc8be-5127-4f62-bdf9-f1db2425c2c1\") " pod="openstack/ovn-controller-metrics-kh252" Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.016629 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b62dc8be-5127-4f62-bdf9-f1db2425c2c1-combined-ca-bundle\") pod \"ovn-controller-metrics-kh252\" (UID: \"b62dc8be-5127-4f62-bdf9-f1db2425c2c1\") " pod="openstack/ovn-controller-metrics-kh252" Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.016660 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/b62dc8be-5127-4f62-bdf9-f1db2425c2c1-ovn-rundir\") pod \"ovn-controller-metrics-kh252\" (UID: \"b62dc8be-5127-4f62-bdf9-f1db2425c2c1\") " 
pod="openstack/ovn-controller-metrics-kh252" Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.016693 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb2zb\" (UniqueName: \"kubernetes.io/projected/b62dc8be-5127-4f62-bdf9-f1db2425c2c1-kube-api-access-nb2zb\") pod \"ovn-controller-metrics-kh252\" (UID: \"b62dc8be-5127-4f62-bdf9-f1db2425c2c1\") " pod="openstack/ovn-controller-metrics-kh252" Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.016736 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b62dc8be-5127-4f62-bdf9-f1db2425c2c1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-kh252\" (UID: \"b62dc8be-5127-4f62-bdf9-f1db2425c2c1\") " pod="openstack/ovn-controller-metrics-kh252" Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.019810 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c89d5d749-kzl8b" Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.088953 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-jwmwm"] Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.089186 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7cb5889db5-jwmwm" podUID="169fd512-6192-4597-a89b-cf9b0c6b76b1" containerName="dnsmasq-dns" containerID="cri-o://98189ad2c12124c7a82be7808867d5919631c27e2d5943c5e2d1f6d573c52d7f" gracePeriod=10 Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.091953 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7cb5889db5-jwmwm" Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.113952 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-h54hg"] Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.131571 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/b62dc8be-5127-4f62-bdf9-f1db2425c2c1-ovs-rundir\") pod \"ovn-controller-metrics-kh252\" (UID: \"b62dc8be-5127-4f62-bdf9-f1db2425c2c1\") " pod="openstack/ovn-controller-metrics-kh252" Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.131750 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b62dc8be-5127-4f62-bdf9-f1db2425c2c1-combined-ca-bundle\") pod \"ovn-controller-metrics-kh252\" (UID: \"b62dc8be-5127-4f62-bdf9-f1db2425c2c1\") " pod="openstack/ovn-controller-metrics-kh252" Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.131793 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/b62dc8be-5127-4f62-bdf9-f1db2425c2c1-ovn-rundir\") pod \"ovn-controller-metrics-kh252\" 
(UID: \"b62dc8be-5127-4f62-bdf9-f1db2425c2c1\") " pod="openstack/ovn-controller-metrics-kh252" Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.131826 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb2zb\" (UniqueName: \"kubernetes.io/projected/b62dc8be-5127-4f62-bdf9-f1db2425c2c1-kube-api-access-nb2zb\") pod \"ovn-controller-metrics-kh252\" (UID: \"b62dc8be-5127-4f62-bdf9-f1db2425c2c1\") " pod="openstack/ovn-controller-metrics-kh252" Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.131873 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b62dc8be-5127-4f62-bdf9-f1db2425c2c1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-kh252\" (UID: \"b62dc8be-5127-4f62-bdf9-f1db2425c2c1\") " pod="openstack/ovn-controller-metrics-kh252" Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.131899 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b62dc8be-5127-4f62-bdf9-f1db2425c2c1-config\") pod \"ovn-controller-metrics-kh252\" (UID: \"b62dc8be-5127-4f62-bdf9-f1db2425c2c1\") " pod="openstack/ovn-controller-metrics-kh252" Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.132771 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b62dc8be-5127-4f62-bdf9-f1db2425c2c1-config\") pod \"ovn-controller-metrics-kh252\" (UID: \"b62dc8be-5127-4f62-bdf9-f1db2425c2c1\") " pod="openstack/ovn-controller-metrics-kh252" Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.133661 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/b62dc8be-5127-4f62-bdf9-f1db2425c2c1-ovn-rundir\") pod \"ovn-controller-metrics-kh252\" (UID: \"b62dc8be-5127-4f62-bdf9-f1db2425c2c1\") " 
pod="openstack/ovn-controller-metrics-kh252" Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.133828 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/b62dc8be-5127-4f62-bdf9-f1db2425c2c1-ovs-rundir\") pod \"ovn-controller-metrics-kh252\" (UID: \"b62dc8be-5127-4f62-bdf9-f1db2425c2c1\") " pod="openstack/ovn-controller-metrics-kh252" Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.140736 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b62dc8be-5127-4f62-bdf9-f1db2425c2c1-combined-ca-bundle\") pod \"ovn-controller-metrics-kh252\" (UID: \"b62dc8be-5127-4f62-bdf9-f1db2425c2c1\") " pod="openstack/ovn-controller-metrics-kh252" Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.140999 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-h54hg"] Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.141120 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-h54hg" Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.150034 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b62dc8be-5127-4f62-bdf9-f1db2425c2c1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-kh252\" (UID: \"b62dc8be-5127-4f62-bdf9-f1db2425c2c1\") " pod="openstack/ovn-controller-metrics-kh252" Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.150670 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.168493 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb2zb\" (UniqueName: \"kubernetes.io/projected/b62dc8be-5127-4f62-bdf9-f1db2425c2c1-kube-api-access-nb2zb\") pod \"ovn-controller-metrics-kh252\" (UID: \"b62dc8be-5127-4f62-bdf9-f1db2425c2c1\") " pod="openstack/ovn-controller-metrics-kh252" Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.172698 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.229731 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.246484 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-kh252" Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.266982 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.325304 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.334278 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/403b4516-d6b7-408a-9013-6789d9d99abf-config\") pod \"dnsmasq-dns-698758b865-h54hg\" (UID: \"403b4516-d6b7-408a-9013-6789d9d99abf\") " pod="openstack/dnsmasq-dns-698758b865-h54hg" Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.334336 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qt5b\" (UniqueName: \"kubernetes.io/projected/403b4516-d6b7-408a-9013-6789d9d99abf-kube-api-access-8qt5b\") pod \"dnsmasq-dns-698758b865-h54hg\" (UID: \"403b4516-d6b7-408a-9013-6789d9d99abf\") " pod="openstack/dnsmasq-dns-698758b865-h54hg" Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.334401 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/403b4516-d6b7-408a-9013-6789d9d99abf-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-h54hg\" (UID: \"403b4516-d6b7-408a-9013-6789d9d99abf\") " pod="openstack/dnsmasq-dns-698758b865-h54hg" Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.334502 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/403b4516-d6b7-408a-9013-6789d9d99abf-dns-svc\") pod \"dnsmasq-dns-698758b865-h54hg\" (UID: \"403b4516-d6b7-408a-9013-6789d9d99abf\") " 
pod="openstack/dnsmasq-dns-698758b865-h54hg"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.334764 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/403b4516-d6b7-408a-9013-6789d9d99abf-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-h54hg\" (UID: \"403b4516-d6b7-408a-9013-6789d9d99abf\") " pod="openstack/dnsmasq-dns-698758b865-h54hg"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.436193 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/403b4516-d6b7-408a-9013-6789d9d99abf-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-h54hg\" (UID: \"403b4516-d6b7-408a-9013-6789d9d99abf\") " pod="openstack/dnsmasq-dns-698758b865-h54hg"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.436349 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/403b4516-d6b7-408a-9013-6789d9d99abf-dns-svc\") pod \"dnsmasq-dns-698758b865-h54hg\" (UID: \"403b4516-d6b7-408a-9013-6789d9d99abf\") " pod="openstack/dnsmasq-dns-698758b865-h54hg"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.436460 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/403b4516-d6b7-408a-9013-6789d9d99abf-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-h54hg\" (UID: \"403b4516-d6b7-408a-9013-6789d9d99abf\") " pod="openstack/dnsmasq-dns-698758b865-h54hg"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.436542 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/403b4516-d6b7-408a-9013-6789d9d99abf-config\") pod \"dnsmasq-dns-698758b865-h54hg\" (UID: \"403b4516-d6b7-408a-9013-6789d9d99abf\") " pod="openstack/dnsmasq-dns-698758b865-h54hg"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.436589 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qt5b\" (UniqueName: \"kubernetes.io/projected/403b4516-d6b7-408a-9013-6789d9d99abf-kube-api-access-8qt5b\") pod \"dnsmasq-dns-698758b865-h54hg\" (UID: \"403b4516-d6b7-408a-9013-6789d9d99abf\") " pod="openstack/dnsmasq-dns-698758b865-h54hg"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.438192 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/403b4516-d6b7-408a-9013-6789d9d99abf-dns-svc\") pod \"dnsmasq-dns-698758b865-h54hg\" (UID: \"403b4516-d6b7-408a-9013-6789d9d99abf\") " pod="openstack/dnsmasq-dns-698758b865-h54hg"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.438587 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/403b4516-d6b7-408a-9013-6789d9d99abf-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-h54hg\" (UID: \"403b4516-d6b7-408a-9013-6789d9d99abf\") " pod="openstack/dnsmasq-dns-698758b865-h54hg"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.439360 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/403b4516-d6b7-408a-9013-6789d9d99abf-config\") pod \"dnsmasq-dns-698758b865-h54hg\" (UID: \"403b4516-d6b7-408a-9013-6789d9d99abf\") " pod="openstack/dnsmasq-dns-698758b865-h54hg"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.441561 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/403b4516-d6b7-408a-9013-6789d9d99abf-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-h54hg\" (UID: \"403b4516-d6b7-408a-9013-6789d9d99abf\") " pod="openstack/dnsmasq-dns-698758b865-h54hg"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.457301 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qt5b\" (UniqueName: \"kubernetes.io/projected/403b4516-d6b7-408a-9013-6789d9d99abf-kube-api-access-8qt5b\") pod \"dnsmasq-dns-698758b865-h54hg\" (UID: \"403b4516-d6b7-408a-9013-6789d9d99abf\") " pod="openstack/dnsmasq-dns-698758b865-h54hg"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.539097 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-h54hg"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.640751 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.642281 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.644666 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.644816 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.645046 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.645501 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-lm7wl"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.650546 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.740048 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/148decd8-3262-44d1-858e-523458f7c1ee-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"148decd8-3262-44d1-858e-523458f7c1ee\") " pod="openstack/ovn-northd-0"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.740101 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/148decd8-3262-44d1-858e-523458f7c1ee-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"148decd8-3262-44d1-858e-523458f7c1ee\") " pod="openstack/ovn-northd-0"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.740140 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/148decd8-3262-44d1-858e-523458f7c1ee-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"148decd8-3262-44d1-858e-523458f7c1ee\") " pod="openstack/ovn-northd-0"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.740162 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/148decd8-3262-44d1-858e-523458f7c1ee-scripts\") pod \"ovn-northd-0\" (UID: \"148decd8-3262-44d1-858e-523458f7c1ee\") " pod="openstack/ovn-northd-0"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.740177 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/148decd8-3262-44d1-858e-523458f7c1ee-config\") pod \"ovn-northd-0\" (UID: \"148decd8-3262-44d1-858e-523458f7c1ee\") " pod="openstack/ovn-northd-0"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.740213 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/148decd8-3262-44d1-858e-523458f7c1ee-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"148decd8-3262-44d1-858e-523458f7c1ee\") " pod="openstack/ovn-northd-0"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.740242 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2wxm\" (UniqueName: \"kubernetes.io/projected/148decd8-3262-44d1-858e-523458f7c1ee-kube-api-access-j2wxm\") pod \"ovn-northd-0\" (UID: \"148decd8-3262-44d1-858e-523458f7c1ee\") " pod="openstack/ovn-northd-0"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.842307 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/148decd8-3262-44d1-858e-523458f7c1ee-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"148decd8-3262-44d1-858e-523458f7c1ee\") " pod="openstack/ovn-northd-0"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.842391 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/148decd8-3262-44d1-858e-523458f7c1ee-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"148decd8-3262-44d1-858e-523458f7c1ee\") " pod="openstack/ovn-northd-0"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.842446 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/148decd8-3262-44d1-858e-523458f7c1ee-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"148decd8-3262-44d1-858e-523458f7c1ee\") " pod="openstack/ovn-northd-0"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.842484 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/148decd8-3262-44d1-858e-523458f7c1ee-scripts\") pod \"ovn-northd-0\" (UID: \"148decd8-3262-44d1-858e-523458f7c1ee\") " pod="openstack/ovn-northd-0"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.842507 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/148decd8-3262-44d1-858e-523458f7c1ee-config\") pod \"ovn-northd-0\" (UID: \"148decd8-3262-44d1-858e-523458f7c1ee\") " pod="openstack/ovn-northd-0"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.842552 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/148decd8-3262-44d1-858e-523458f7c1ee-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"148decd8-3262-44d1-858e-523458f7c1ee\") " pod="openstack/ovn-northd-0"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.842586 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2wxm\" (UniqueName: \"kubernetes.io/projected/148decd8-3262-44d1-858e-523458f7c1ee-kube-api-access-j2wxm\") pod \"ovn-northd-0\" (UID: \"148decd8-3262-44d1-858e-523458f7c1ee\") " pod="openstack/ovn-northd-0"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.842679 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bc0b2c39-8c1e-4401-97f6-a4306b435436-etc-swift\") pod \"swift-storage-0\" (UID: \"bc0b2c39-8c1e-4401-97f6-a4306b435436\") " pod="openstack/swift-storage-0"
Oct 03 14:59:37 crc kubenswrapper[4774]: E1003 14:59:37.842880 4774 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Oct 03 14:59:37 crc kubenswrapper[4774]: E1003 14:59:37.842926 4774 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Oct 03 14:59:37 crc kubenswrapper[4774]: E1003 14:59:37.843012 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc0b2c39-8c1e-4401-97f6-a4306b435436-etc-swift podName:bc0b2c39-8c1e-4401-97f6-a4306b435436 nodeName:}" failed. No retries permitted until 2025-10-03 14:59:45.842990412 +0000 UTC m=+1008.432193874 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bc0b2c39-8c1e-4401-97f6-a4306b435436-etc-swift") pod "swift-storage-0" (UID: "bc0b2c39-8c1e-4401-97f6-a4306b435436") : configmap "swift-ring-files" not found
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.843181 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/148decd8-3262-44d1-858e-523458f7c1ee-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"148decd8-3262-44d1-858e-523458f7c1ee\") " pod="openstack/ovn-northd-0"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.843468 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/148decd8-3262-44d1-858e-523458f7c1ee-scripts\") pod \"ovn-northd-0\" (UID: \"148decd8-3262-44d1-858e-523458f7c1ee\") " pod="openstack/ovn-northd-0"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.843945 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/148decd8-3262-44d1-858e-523458f7c1ee-config\") pod \"ovn-northd-0\" (UID: \"148decd8-3262-44d1-858e-523458f7c1ee\") " pod="openstack/ovn-northd-0"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.849318 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/148decd8-3262-44d1-858e-523458f7c1ee-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"148decd8-3262-44d1-858e-523458f7c1ee\") " pod="openstack/ovn-northd-0"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.849399 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/148decd8-3262-44d1-858e-523458f7c1ee-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"148decd8-3262-44d1-858e-523458f7c1ee\") " pod="openstack/ovn-northd-0"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.849447 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/148decd8-3262-44d1-858e-523458f7c1ee-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"148decd8-3262-44d1-858e-523458f7c1ee\") " pod="openstack/ovn-northd-0"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.870549 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2wxm\" (UniqueName: \"kubernetes.io/projected/148decd8-3262-44d1-858e-523458f7c1ee-kube-api-access-j2wxm\") pod \"ovn-northd-0\" (UID: \"148decd8-3262-44d1-858e-523458f7c1ee\") " pod="openstack/ovn-northd-0"
Oct 03 14:59:37 crc kubenswrapper[4774]: I1003 14:59:37.971739 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Oct 03 14:59:38 crc kubenswrapper[4774]: I1003 14:59:38.274282 4774 generic.go:334] "Generic (PLEG): container finished" podID="169fd512-6192-4597-a89b-cf9b0c6b76b1" containerID="98189ad2c12124c7a82be7808867d5919631c27e2d5943c5e2d1f6d573c52d7f" exitCode=0
Oct 03 14:59:38 crc kubenswrapper[4774]: I1003 14:59:38.274381 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-jwmwm" event={"ID":"169fd512-6192-4597-a89b-cf9b0c6b76b1","Type":"ContainerDied","Data":"98189ad2c12124c7a82be7808867d5919631c27e2d5943c5e2d1f6d573c52d7f"}
Oct 03 14:59:38 crc kubenswrapper[4774]: I1003 14:59:38.678495 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Oct 03 14:59:38 crc kubenswrapper[4774]: I1003 14:59:38.787815 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Oct 03 14:59:38 crc kubenswrapper[4774]: I1003 14:59:38.840303 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Oct 03 14:59:39 crc kubenswrapper[4774]: I1003 14:59:39.066106 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7cb5889db5-jwmwm" podUID="169fd512-6192-4597-a89b-cf9b0c6b76b1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.110:5353: connect: connection refused"
Oct 03 14:59:39 crc kubenswrapper[4774]: I1003 14:59:39.074191 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-cmtxq"
Oct 03 14:59:39 crc kubenswrapper[4774]: I1003 14:59:39.171171 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5446\" (UniqueName: \"kubernetes.io/projected/b91a9c71-b83c-4e14-957f-820cba700771-kube-api-access-r5446\") pod \"b91a9c71-b83c-4e14-957f-820cba700771\" (UID: \"b91a9c71-b83c-4e14-957f-820cba700771\") "
Oct 03 14:59:39 crc kubenswrapper[4774]: I1003 14:59:39.171313 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b91a9c71-b83c-4e14-957f-820cba700771-config\") pod \"b91a9c71-b83c-4e14-957f-820cba700771\" (UID: \"b91a9c71-b83c-4e14-957f-820cba700771\") "
Oct 03 14:59:39 crc kubenswrapper[4774]: I1003 14:59:39.171406 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b91a9c71-b83c-4e14-957f-820cba700771-dns-svc\") pod \"b91a9c71-b83c-4e14-957f-820cba700771\" (UID: \"b91a9c71-b83c-4e14-957f-820cba700771\") "
Oct 03 14:59:39 crc kubenswrapper[4774]: I1003 14:59:39.182577 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b91a9c71-b83c-4e14-957f-820cba700771-kube-api-access-r5446" (OuterVolumeSpecName: "kube-api-access-r5446") pod "b91a9c71-b83c-4e14-957f-820cba700771" (UID: "b91a9c71-b83c-4e14-957f-820cba700771"). InnerVolumeSpecName "kube-api-access-r5446". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:59:39 crc kubenswrapper[4774]: I1003 14:59:39.198417 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b91a9c71-b83c-4e14-957f-820cba700771-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b91a9c71-b83c-4e14-957f-820cba700771" (UID: "b91a9c71-b83c-4e14-957f-820cba700771"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:59:39 crc kubenswrapper[4774]: I1003 14:59:39.214074 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b91a9c71-b83c-4e14-957f-820cba700771-config" (OuterVolumeSpecName: "config") pod "b91a9c71-b83c-4e14-957f-820cba700771" (UID: "b91a9c71-b83c-4e14-957f-820cba700771"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:59:39 crc kubenswrapper[4774]: I1003 14:59:39.273568 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5446\" (UniqueName: \"kubernetes.io/projected/b91a9c71-b83c-4e14-957f-820cba700771-kube-api-access-r5446\") on node \"crc\" DevicePath \"\""
Oct 03 14:59:39 crc kubenswrapper[4774]: I1003 14:59:39.273602 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b91a9c71-b83c-4e14-957f-820cba700771-config\") on node \"crc\" DevicePath \"\""
Oct 03 14:59:39 crc kubenswrapper[4774]: I1003 14:59:39.273614 4774 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b91a9c71-b83c-4e14-957f-820cba700771-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 03 14:59:39 crc kubenswrapper[4774]: I1003 14:59:39.289812 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-cmtxq" event={"ID":"b91a9c71-b83c-4e14-957f-820cba700771","Type":"ContainerDied","Data":"d58d6367d94351bb56b0ff2e431904b9b441d7c7c4a46b9226a08a7d638883ae"}
Oct 03 14:59:39 crc kubenswrapper[4774]: I1003 14:59:39.289823 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-cmtxq"
Oct 03 14:59:39 crc kubenswrapper[4774]: I1003 14:59:39.368592 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-cmtxq"]
Oct 03 14:59:39 crc kubenswrapper[4774]: I1003 14:59:39.372708 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-cmtxq"]
Oct 03 14:59:39 crc kubenswrapper[4774]: I1003 14:59:39.395988 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-jwmwm"
Oct 03 14:59:39 crc kubenswrapper[4774]: I1003 14:59:39.476071 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/169fd512-6192-4597-a89b-cf9b0c6b76b1-config\") pod \"169fd512-6192-4597-a89b-cf9b0c6b76b1\" (UID: \"169fd512-6192-4597-a89b-cf9b0c6b76b1\") "
Oct 03 14:59:39 crc kubenswrapper[4774]: I1003 14:59:39.476175 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rglrs\" (UniqueName: \"kubernetes.io/projected/169fd512-6192-4597-a89b-cf9b0c6b76b1-kube-api-access-rglrs\") pod \"169fd512-6192-4597-a89b-cf9b0c6b76b1\" (UID: \"169fd512-6192-4597-a89b-cf9b0c6b76b1\") "
Oct 03 14:59:39 crc kubenswrapper[4774]: I1003 14:59:39.476601 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/169fd512-6192-4597-a89b-cf9b0c6b76b1-dns-svc\") pod \"169fd512-6192-4597-a89b-cf9b0c6b76b1\" (UID: \"169fd512-6192-4597-a89b-cf9b0c6b76b1\") "
Oct 03 14:59:39 crc kubenswrapper[4774]: I1003 14:59:39.481986 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/169fd512-6192-4597-a89b-cf9b0c6b76b1-kube-api-access-rglrs" (OuterVolumeSpecName: "kube-api-access-rglrs") pod "169fd512-6192-4597-a89b-cf9b0c6b76b1" (UID: "169fd512-6192-4597-a89b-cf9b0c6b76b1"). InnerVolumeSpecName "kube-api-access-rglrs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:59:39 crc kubenswrapper[4774]: I1003 14:59:39.519164 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/169fd512-6192-4597-a89b-cf9b0c6b76b1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "169fd512-6192-4597-a89b-cf9b0c6b76b1" (UID: "169fd512-6192-4597-a89b-cf9b0c6b76b1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:59:39 crc kubenswrapper[4774]: I1003 14:59:39.523891 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/169fd512-6192-4597-a89b-cf9b0c6b76b1-config" (OuterVolumeSpecName: "config") pod "169fd512-6192-4597-a89b-cf9b0c6b76b1" (UID: "169fd512-6192-4597-a89b-cf9b0c6b76b1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:59:39 crc kubenswrapper[4774]: I1003 14:59:39.578623 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/169fd512-6192-4597-a89b-cf9b0c6b76b1-config\") on node \"crc\" DevicePath \"\""
Oct 03 14:59:39 crc kubenswrapper[4774]: I1003 14:59:39.578920 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rglrs\" (UniqueName: \"kubernetes.io/projected/169fd512-6192-4597-a89b-cf9b0c6b76b1-kube-api-access-rglrs\") on node \"crc\" DevicePath \"\""
Oct 03 14:59:39 crc kubenswrapper[4774]: I1003 14:59:39.578937 4774 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/169fd512-6192-4597-a89b-cf9b0c6b76b1-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 03 14:59:39 crc kubenswrapper[4774]: I1003 14:59:39.579560 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c89d5d749-kzl8b"]
Oct 03 14:59:39 crc kubenswrapper[4774]: W1003 14:59:39.583664 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod403b4516_d6b7_408a_9013_6789d9d99abf.slice/crio-1647faa3af5c1796971e13c27367bc28b4a0153995ec8757a35cd5b4e37c99c7 WatchSource:0}: Error finding container 1647faa3af5c1796971e13c27367bc28b4a0153995ec8757a35cd5b4e37c99c7: Status 404 returned error can't find the container with id 1647faa3af5c1796971e13c27367bc28b4a0153995ec8757a35cd5b4e37c99c7
Oct 03 14:59:39 crc kubenswrapper[4774]: I1003 14:59:39.587282 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-h54hg"]
Oct 03 14:59:39 crc kubenswrapper[4774]: W1003 14:59:39.588029 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb62dc8be_5127_4f62_bdf9_f1db2425c2c1.slice/crio-86cbefc12448901ac006ae91c5204d13d753f04581d6331060da685607aa3b73 WatchSource:0}: Error finding container 86cbefc12448901ac006ae91c5204d13d753f04581d6331060da685607aa3b73: Status 404 returned error can't find the container with id 86cbefc12448901ac006ae91c5204d13d753f04581d6331060da685607aa3b73
Oct 03 14:59:39 crc kubenswrapper[4774]: I1003 14:59:39.595015 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-kh252"]
Oct 03 14:59:39 crc kubenswrapper[4774]: I1003 14:59:39.766809 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Oct 03 14:59:39 crc kubenswrapper[4774]: W1003 14:59:39.790086 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod148decd8_3262_44d1_858e_523458f7c1ee.slice/crio-611face4776aa899a117ed40d9ac90c5c06b7b361a586ed618d032b0ba5e790c WatchSource:0}: Error finding container 611face4776aa899a117ed40d9ac90c5c06b7b361a586ed618d032b0ba5e790c: Status 404 returned error can't find the container with id 611face4776aa899a117ed40d9ac90c5c06b7b361a586ed618d032b0ba5e790c
Oct 03 14:59:40 crc kubenswrapper[4774]: I1003 14:59:40.304199 4774 generic.go:334] "Generic (PLEG): container finished" podID="57a4a416-3b4b-48cb-9f31-f7fa373f25df" containerID="05346c83c99d951814371acf83b6147466553a0e3812f207f91f37d8c524eaa1" exitCode=0
Oct 03 14:59:40 crc kubenswrapper[4774]: I1003 14:59:40.304278 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c89d5d749-kzl8b" event={"ID":"57a4a416-3b4b-48cb-9f31-f7fa373f25df","Type":"ContainerDied","Data":"05346c83c99d951814371acf83b6147466553a0e3812f207f91f37d8c524eaa1"}
Oct 03 14:59:40 crc kubenswrapper[4774]: I1003 14:59:40.304544 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c89d5d749-kzl8b" event={"ID":"57a4a416-3b4b-48cb-9f31-f7fa373f25df","Type":"ContainerStarted","Data":"303d00979e9a5f7227409852a9d6105d2b531168e21aef835ac1bbb89d32d4e4"}
Oct 03 14:59:40 crc kubenswrapper[4774]: I1003 14:59:40.312306 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-jwmwm" event={"ID":"169fd512-6192-4597-a89b-cf9b0c6b76b1","Type":"ContainerDied","Data":"79e5d50e2d3608cc6fb8d213e5516485f1375afd983aa6042942730ac2b876df"}
Oct 03 14:59:40 crc kubenswrapper[4774]: I1003 14:59:40.312340 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-jwmwm"
Oct 03 14:59:40 crc kubenswrapper[4774]: I1003 14:59:40.312368 4774 scope.go:117] "RemoveContainer" containerID="98189ad2c12124c7a82be7808867d5919631c27e2d5943c5e2d1f6d573c52d7f"
Oct 03 14:59:40 crc kubenswrapper[4774]: I1003 14:59:40.319627 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-kh252" event={"ID":"b62dc8be-5127-4f62-bdf9-f1db2425c2c1","Type":"ContainerStarted","Data":"2ade23e5ff78205f5ff1e98bc94cd270578ffeadf951cda2c444d5795fde38c6"}
Oct 03 14:59:40 crc kubenswrapper[4774]: I1003 14:59:40.319675 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-kh252" event={"ID":"b62dc8be-5127-4f62-bdf9-f1db2425c2c1","Type":"ContainerStarted","Data":"86cbefc12448901ac006ae91c5204d13d753f04581d6331060da685607aa3b73"}
Oct 03 14:59:40 crc kubenswrapper[4774]: I1003 14:59:40.322271 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-xxj5b" event={"ID":"5244bd24-f205-4576-a7cd-6da859f28e21","Type":"ContainerStarted","Data":"aed5639b19e590a01510f40d3a49a4107ad88cc8b2aa7f3d2a9be5595631606b"}
Oct 03 14:59:40 crc kubenswrapper[4774]: I1003 14:59:40.324593 4774 generic.go:334] "Generic (PLEG): container finished" podID="403b4516-d6b7-408a-9013-6789d9d99abf" containerID="d15b8be91d87262ff0eac3104ca8c81ceeed645e51cdffd75373e46913752aa5" exitCode=0
Oct 03 14:59:40 crc kubenswrapper[4774]: I1003 14:59:40.324656 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-h54hg" event={"ID":"403b4516-d6b7-408a-9013-6789d9d99abf","Type":"ContainerDied","Data":"d15b8be91d87262ff0eac3104ca8c81ceeed645e51cdffd75373e46913752aa5"}
Oct 03 14:59:40 crc kubenswrapper[4774]: I1003 14:59:40.324705 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-h54hg" event={"ID":"403b4516-d6b7-408a-9013-6789d9d99abf","Type":"ContainerStarted","Data":"1647faa3af5c1796971e13c27367bc28b4a0153995ec8757a35cd5b4e37c99c7"}
Oct 03 14:59:40 crc kubenswrapper[4774]: I1003 14:59:40.325495 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"148decd8-3262-44d1-858e-523458f7c1ee","Type":"ContainerStarted","Data":"611face4776aa899a117ed40d9ac90c5c06b7b361a586ed618d032b0ba5e790c"}
Oct 03 14:59:40 crc kubenswrapper[4774]: I1003 14:59:40.357838 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-kh252" podStartSLOduration=4.357820573 podStartE2EDuration="4.357820573s" podCreationTimestamp="2025-10-03 14:59:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:59:40.357076794 +0000 UTC m=+1002.946280246" watchObservedRunningTime="2025-10-03 14:59:40.357820573 +0000 UTC m=+1002.947024035"
Oct 03 14:59:40 crc kubenswrapper[4774]: I1003 14:59:40.419271 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-xxj5b" podStartSLOduration=2.982436592 podStartE2EDuration="7.419253693s" podCreationTimestamp="2025-10-03 14:59:33 +0000 UTC" firstStartedPulling="2025-10-03 14:59:34.672471408 +0000 UTC m=+997.261674860" lastFinishedPulling="2025-10-03 14:59:39.109288489 +0000 UTC m=+1001.698491961" observedRunningTime="2025-10-03 14:59:40.41470932 +0000 UTC m=+1003.003912792" watchObservedRunningTime="2025-10-03 14:59:40.419253693 +0000 UTC m=+1003.008457145"
Oct 03 14:59:40 crc kubenswrapper[4774]: I1003 14:59:40.522103 4774 scope.go:117] "RemoveContainer" containerID="434c2a1209f9be7e0d01cfc00c482daf1bf212471541c2a64d486a17aff19d55"
Oct 03 14:59:40 crc kubenswrapper[4774]: I1003 14:59:40.535421 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-jwmwm"]
Oct 03 14:59:40 crc kubenswrapper[4774]: I1003 14:59:40.540502 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-jwmwm"]
Oct 03 14:59:41 crc kubenswrapper[4774]: I1003 14:59:41.311853 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="169fd512-6192-4597-a89b-cf9b0c6b76b1" path="/var/lib/kubelet/pods/169fd512-6192-4597-a89b-cf9b0c6b76b1/volumes"
Oct 03 14:59:41 crc kubenswrapper[4774]: I1003 14:59:41.314270 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b91a9c71-b83c-4e14-957f-820cba700771" path="/var/lib/kubelet/pods/b91a9c71-b83c-4e14-957f-820cba700771/volumes"
Oct 03 14:59:41 crc kubenswrapper[4774]: I1003 14:59:41.335244 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-h54hg" event={"ID":"403b4516-d6b7-408a-9013-6789d9d99abf","Type":"ContainerStarted","Data":"4c4b5a3f1c9d59a7a4e9565b18e824792f91d86a248402a2a146c1b168045fc3"}
Oct 03 14:59:41 crc kubenswrapper[4774]: I1003 14:59:41.335332 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-h54hg"
Oct 03 14:59:41 crc kubenswrapper[4774]: I1003 14:59:41.337916 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"148decd8-3262-44d1-858e-523458f7c1ee","Type":"ContainerStarted","Data":"ca440c78fbd9910e821275b8b356df8115021b00a76ab3579663e4112b7369e0"}
Oct 03 14:59:41 crc kubenswrapper[4774]: I1003 14:59:41.344247 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c89d5d749-kzl8b" event={"ID":"57a4a416-3b4b-48cb-9f31-f7fa373f25df","Type":"ContainerStarted","Data":"86573619be71ee4a7de1e61fbc435b5cbb77829e578c4873438dba6416d337f9"}
Oct 03 14:59:41 crc kubenswrapper[4774]: I1003 14:59:41.344767 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c89d5d749-kzl8b"
Oct 03 14:59:41 crc kubenswrapper[4774]: I1003 14:59:41.371341 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-h54hg" podStartSLOduration=4.37132582 podStartE2EDuration="4.37132582s" podCreationTimestamp="2025-10-03 14:59:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:59:41.365311581 +0000 UTC m=+1003.954515043" watchObservedRunningTime="2025-10-03 14:59:41.37132582 +0000 UTC m=+1003.960529272"
Oct 03 14:59:41 crc kubenswrapper[4774]: I1003 14:59:41.385100 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c89d5d749-kzl8b" podStartSLOduration=5.385082003 podStartE2EDuration="5.385082003s" podCreationTimestamp="2025-10-03 14:59:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:59:41.382253993 +0000 UTC m=+1003.971457455" watchObservedRunningTime="2025-10-03 14:59:41.385082003 +0000 UTC m=+1003.974285455"
Oct 03 14:59:41 crc kubenswrapper[4774]: I1003 14:59:41.540126 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Oct 03 14:59:41 crc kubenswrapper[4774]: I1003 14:59:41.594065 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Oct 03 14:59:42 crc kubenswrapper[4774]: I1003 14:59:42.358435 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"148decd8-3262-44d1-858e-523458f7c1ee","Type":"ContainerStarted","Data":"84277c260c955161978c3216aab1887fe9eab9e9d76f6c40434b7581e78ea445"}
Oct 03 14:59:42 crc kubenswrapper[4774]: I1003 14:59:42.378577 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=4.185088182 podStartE2EDuration="5.378557263s" podCreationTimestamp="2025-10-03 14:59:37 +0000 UTC" firstStartedPulling="2025-10-03 14:59:39.792566171 +0000 UTC m=+1002.381769623" lastFinishedPulling="2025-10-03 14:59:40.986035252 +0000 UTC m=+1003.575238704" observedRunningTime="2025-10-03 14:59:42.373562688 +0000 UTC m=+1004.962766150" watchObservedRunningTime="2025-10-03 14:59:42.378557263 +0000 UTC m=+1004.967760715"
Oct 03 14:59:42 crc kubenswrapper[4774]: I1003 14:59:42.972065 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Oct 03 14:59:45 crc kubenswrapper[4774]: I1003 14:59:45.890336 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bc0b2c39-8c1e-4401-97f6-a4306b435436-etc-swift\") pod \"swift-storage-0\" (UID: \"bc0b2c39-8c1e-4401-97f6-a4306b435436\") " pod="openstack/swift-storage-0"
Oct 03 14:59:45 crc kubenswrapper[4774]: E1003 14:59:45.890633 4774 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Oct 03 14:59:45 crc kubenswrapper[4774]: E1003 14:59:45.890977 4774 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Oct 03 14:59:45 crc kubenswrapper[4774]: E1003 14:59:45.891038 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc0b2c39-8c1e-4401-97f6-a4306b435436-etc-swift podName:bc0b2c39-8c1e-4401-97f6-a4306b435436 nodeName:}" failed. No retries permitted until 2025-10-03 15:00:01.891020656 +0000 UTC m=+1024.480224108 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bc0b2c39-8c1e-4401-97f6-a4306b435436-etc-swift") pod "swift-storage-0" (UID: "bc0b2c39-8c1e-4401-97f6-a4306b435436") : configmap "swift-ring-files" not found
Oct 03 14:59:46 crc kubenswrapper[4774]: I1003 14:59:46.734363 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-jpslj"]
Oct 03 14:59:46 crc kubenswrapper[4774]: E1003 14:59:46.735110 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="169fd512-6192-4597-a89b-cf9b0c6b76b1" containerName="init"
Oct 03 14:59:46 crc kubenswrapper[4774]: I1003 14:59:46.735124 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="169fd512-6192-4597-a89b-cf9b0c6b76b1" containerName="init"
Oct 03 14:59:46 crc kubenswrapper[4774]: E1003 14:59:46.735145 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="169fd512-6192-4597-a89b-cf9b0c6b76b1" containerName="dnsmasq-dns"
Oct 03 14:59:46 crc kubenswrapper[4774]: I1003 14:59:46.735150 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="169fd512-6192-4597-a89b-cf9b0c6b76b1" containerName="dnsmasq-dns"
Oct 03 14:59:46 crc kubenswrapper[4774]: I1003 14:59:46.735338 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="169fd512-6192-4597-a89b-cf9b0c6b76b1" containerName="dnsmasq-dns"
Oct 03 14:59:46 crc kubenswrapper[4774]: I1003 14:59:46.736243 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jpslj"
Oct 03 14:59:46 crc kubenswrapper[4774]: I1003 14:59:46.754266 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-jpslj"]
Oct 03 14:59:46 crc kubenswrapper[4774]: I1003 14:59:46.805760 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhqpd\" (UniqueName: \"kubernetes.io/projected/109aa946-04bb-44f0-be87-89af984dd880-kube-api-access-vhqpd\") pod \"keystone-db-create-jpslj\" (UID: \"109aa946-04bb-44f0-be87-89af984dd880\") " pod="openstack/keystone-db-create-jpslj"
Oct 03 14:59:46 crc kubenswrapper[4774]: I1003 14:59:46.908235 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhqpd\" (UniqueName: \"kubernetes.io/projected/109aa946-04bb-44f0-be87-89af984dd880-kube-api-access-vhqpd\") pod \"keystone-db-create-jpslj\" (UID: \"109aa946-04bb-44f0-be87-89af984dd880\") " pod="openstack/keystone-db-create-jpslj"
Oct 03 14:59:46 crc kubenswrapper[4774]: I1003 14:59:46.930718 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhqpd\" (UniqueName: \"kubernetes.io/projected/109aa946-04bb-44f0-be87-89af984dd880-kube-api-access-vhqpd\") pod \"keystone-db-create-jpslj\" (UID: \"109aa946-04bb-44f0-be87-89af984dd880\") " pod="openstack/keystone-db-create-jpslj"
Oct 03 14:59:46 crc kubenswrapper[4774]: I1003 14:59:46.938518 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-m8z68"]
Oct 03 14:59:46 crc kubenswrapper[4774]: I1003 14:59:46.939749 4774 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/placement-db-create-m8z68" Oct 03 14:59:46 crc kubenswrapper[4774]: I1003 14:59:46.955511 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-m8z68"] Oct 03 14:59:47 crc kubenswrapper[4774]: I1003 14:59:47.009809 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j4vt\" (UniqueName: \"kubernetes.io/projected/bb71d2a9-eead-444a-ab69-cd9315317392-kube-api-access-4j4vt\") pod \"placement-db-create-m8z68\" (UID: \"bb71d2a9-eead-444a-ab69-cd9315317392\") " pod="openstack/placement-db-create-m8z68" Oct 03 14:59:47 crc kubenswrapper[4774]: I1003 14:59:47.022570 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c89d5d749-kzl8b" Oct 03 14:59:47 crc kubenswrapper[4774]: I1003 14:59:47.066733 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jpslj" Oct 03 14:59:47 crc kubenswrapper[4774]: I1003 14:59:47.111477 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j4vt\" (UniqueName: \"kubernetes.io/projected/bb71d2a9-eead-444a-ab69-cd9315317392-kube-api-access-4j4vt\") pod \"placement-db-create-m8z68\" (UID: \"bb71d2a9-eead-444a-ab69-cd9315317392\") " pod="openstack/placement-db-create-m8z68" Oct 03 14:59:47 crc kubenswrapper[4774]: I1003 14:59:47.129723 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j4vt\" (UniqueName: \"kubernetes.io/projected/bb71d2a9-eead-444a-ab69-cd9315317392-kube-api-access-4j4vt\") pod \"placement-db-create-m8z68\" (UID: \"bb71d2a9-eead-444a-ab69-cd9315317392\") " pod="openstack/placement-db-create-m8z68" Oct 03 14:59:47 crc kubenswrapper[4774]: I1003 14:59:47.166970 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-7sz4d"] Oct 03 14:59:47 crc kubenswrapper[4774]: I1003 
14:59:47.168648 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-7sz4d" Oct 03 14:59:47 crc kubenswrapper[4774]: I1003 14:59:47.184744 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-7sz4d"] Oct 03 14:59:47 crc kubenswrapper[4774]: I1003 14:59:47.213250 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c76gd\" (UniqueName: \"kubernetes.io/projected/0dd8b18c-1808-48ac-8375-7cc2aff05b12-kube-api-access-c76gd\") pod \"glance-db-create-7sz4d\" (UID: \"0dd8b18c-1808-48ac-8375-7cc2aff05b12\") " pod="openstack/glance-db-create-7sz4d" Oct 03 14:59:47 crc kubenswrapper[4774]: I1003 14:59:47.314915 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c76gd\" (UniqueName: \"kubernetes.io/projected/0dd8b18c-1808-48ac-8375-7cc2aff05b12-kube-api-access-c76gd\") pod \"glance-db-create-7sz4d\" (UID: \"0dd8b18c-1808-48ac-8375-7cc2aff05b12\") " pod="openstack/glance-db-create-7sz4d" Oct 03 14:59:47 crc kubenswrapper[4774]: I1003 14:59:47.330837 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m8z68" Oct 03 14:59:47 crc kubenswrapper[4774]: I1003 14:59:47.333208 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c76gd\" (UniqueName: \"kubernetes.io/projected/0dd8b18c-1808-48ac-8375-7cc2aff05b12-kube-api-access-c76gd\") pod \"glance-db-create-7sz4d\" (UID: \"0dd8b18c-1808-48ac-8375-7cc2aff05b12\") " pod="openstack/glance-db-create-7sz4d" Oct 03 14:59:47 crc kubenswrapper[4774]: I1003 14:59:47.513300 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-7sz4d" Oct 03 14:59:47 crc kubenswrapper[4774]: I1003 14:59:47.540702 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-h54hg" Oct 03 14:59:47 crc kubenswrapper[4774]: I1003 14:59:47.556631 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-jpslj"] Oct 03 14:59:47 crc kubenswrapper[4774]: W1003 14:59:47.565443 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod109aa946_04bb_44f0_be87_89af984dd880.slice/crio-9622d3aad0b937e49df3b98fce7abf94a77d0d66b8bea01064dda8884ef05128 WatchSource:0}: Error finding container 9622d3aad0b937e49df3b98fce7abf94a77d0d66b8bea01064dda8884ef05128: Status 404 returned error can't find the container with id 9622d3aad0b937e49df3b98fce7abf94a77d0d66b8bea01064dda8884ef05128 Oct 03 14:59:47 crc kubenswrapper[4774]: I1003 14:59:47.599601 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c89d5d749-kzl8b"] Oct 03 14:59:47 crc kubenswrapper[4774]: I1003 14:59:47.599859 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c89d5d749-kzl8b" podUID="57a4a416-3b4b-48cb-9f31-f7fa373f25df" containerName="dnsmasq-dns" containerID="cri-o://86573619be71ee4a7de1e61fbc435b5cbb77829e578c4873438dba6416d337f9" gracePeriod=10 Oct 03 14:59:47 crc kubenswrapper[4774]: I1003 14:59:47.758670 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-m8z68"] Oct 03 14:59:47 crc kubenswrapper[4774]: W1003 14:59:47.765529 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb71d2a9_eead_444a_ab69_cd9315317392.slice/crio-ee654c9f07108d14a22a293ddb3532c04ca9b1a2b5653266f72e437769c1205e WatchSource:0}: Error finding container 
ee654c9f07108d14a22a293ddb3532c04ca9b1a2b5653266f72e437769c1205e: Status 404 returned error can't find the container with id ee654c9f07108d14a22a293ddb3532c04ca9b1a2b5653266f72e437769c1205e Oct 03 14:59:48 crc kubenswrapper[4774]: I1003 14:59:47.999887 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-7sz4d"] Oct 03 14:59:48 crc kubenswrapper[4774]: W1003 14:59:48.000509 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0dd8b18c_1808_48ac_8375_7cc2aff05b12.slice/crio-ebb1373e6565561c4aebf31513efc8d5be4737173b5242fd41c1243d0841e9a3 WatchSource:0}: Error finding container ebb1373e6565561c4aebf31513efc8d5be4737173b5242fd41c1243d0841e9a3: Status 404 returned error can't find the container with id ebb1373e6565561c4aebf31513efc8d5be4737173b5242fd41c1243d0841e9a3 Oct 03 14:59:48 crc kubenswrapper[4774]: I1003 14:59:48.214392 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c89d5d749-kzl8b" Oct 03 14:59:48 crc kubenswrapper[4774]: I1003 14:59:48.226838 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x96c7\" (UniqueName: \"kubernetes.io/projected/57a4a416-3b4b-48cb-9f31-f7fa373f25df-kube-api-access-x96c7\") pod \"57a4a416-3b4b-48cb-9f31-f7fa373f25df\" (UID: \"57a4a416-3b4b-48cb-9f31-f7fa373f25df\") " Oct 03 14:59:48 crc kubenswrapper[4774]: I1003 14:59:48.226896 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57a4a416-3b4b-48cb-9f31-f7fa373f25df-dns-svc\") pod \"57a4a416-3b4b-48cb-9f31-f7fa373f25df\" (UID: \"57a4a416-3b4b-48cb-9f31-f7fa373f25df\") " Oct 03 14:59:48 crc kubenswrapper[4774]: I1003 14:59:48.226931 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57a4a416-3b4b-48cb-9f31-f7fa373f25df-ovsdbserver-sb\") pod \"57a4a416-3b4b-48cb-9f31-f7fa373f25df\" (UID: \"57a4a416-3b4b-48cb-9f31-f7fa373f25df\") " Oct 03 14:59:48 crc kubenswrapper[4774]: I1003 14:59:48.226975 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57a4a416-3b4b-48cb-9f31-f7fa373f25df-config\") pod \"57a4a416-3b4b-48cb-9f31-f7fa373f25df\" (UID: \"57a4a416-3b4b-48cb-9f31-f7fa373f25df\") " Oct 03 14:59:48 crc kubenswrapper[4774]: I1003 14:59:48.232775 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a4a416-3b4b-48cb-9f31-f7fa373f25df-kube-api-access-x96c7" (OuterVolumeSpecName: "kube-api-access-x96c7") pod "57a4a416-3b4b-48cb-9f31-f7fa373f25df" (UID: "57a4a416-3b4b-48cb-9f31-f7fa373f25df"). InnerVolumeSpecName "kube-api-access-x96c7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:59:48 crc kubenswrapper[4774]: I1003 14:59:48.310243 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57a4a416-3b4b-48cb-9f31-f7fa373f25df-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "57a4a416-3b4b-48cb-9f31-f7fa373f25df" (UID: "57a4a416-3b4b-48cb-9f31-f7fa373f25df"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:59:48 crc kubenswrapper[4774]: I1003 14:59:48.311246 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57a4a416-3b4b-48cb-9f31-f7fa373f25df-config" (OuterVolumeSpecName: "config") pod "57a4a416-3b4b-48cb-9f31-f7fa373f25df" (UID: "57a4a416-3b4b-48cb-9f31-f7fa373f25df"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:59:48 crc kubenswrapper[4774]: I1003 14:59:48.315365 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57a4a416-3b4b-48cb-9f31-f7fa373f25df-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "57a4a416-3b4b-48cb-9f31-f7fa373f25df" (UID: "57a4a416-3b4b-48cb-9f31-f7fa373f25df"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:59:48 crc kubenswrapper[4774]: I1003 14:59:48.328090 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x96c7\" (UniqueName: \"kubernetes.io/projected/57a4a416-3b4b-48cb-9f31-f7fa373f25df-kube-api-access-x96c7\") on node \"crc\" DevicePath \"\"" Oct 03 14:59:48 crc kubenswrapper[4774]: I1003 14:59:48.328124 4774 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57a4a416-3b4b-48cb-9f31-f7fa373f25df-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 14:59:48 crc kubenswrapper[4774]: I1003 14:59:48.328137 4774 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57a4a416-3b4b-48cb-9f31-f7fa373f25df-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 14:59:48 crc kubenswrapper[4774]: I1003 14:59:48.328150 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57a4a416-3b4b-48cb-9f31-f7fa373f25df-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:59:48 crc kubenswrapper[4774]: I1003 14:59:48.420829 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7sz4d" event={"ID":"0dd8b18c-1808-48ac-8375-7cc2aff05b12","Type":"ContainerStarted","Data":"ebb1373e6565561c4aebf31513efc8d5be4737173b5242fd41c1243d0841e9a3"} Oct 03 14:59:48 crc kubenswrapper[4774]: I1003 14:59:48.423075 4774 generic.go:334] "Generic (PLEG): container finished" podID="109aa946-04bb-44f0-be87-89af984dd880" containerID="291639694edbf27b3b0404aa7375f128dd2eedc659d823a68f0102c9b365ea54" exitCode=0 Oct 03 14:59:48 crc kubenswrapper[4774]: I1003 14:59:48.423198 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jpslj" event={"ID":"109aa946-04bb-44f0-be87-89af984dd880","Type":"ContainerDied","Data":"291639694edbf27b3b0404aa7375f128dd2eedc659d823a68f0102c9b365ea54"} Oct 03 
14:59:48 crc kubenswrapper[4774]: I1003 14:59:48.423222 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jpslj" event={"ID":"109aa946-04bb-44f0-be87-89af984dd880","Type":"ContainerStarted","Data":"9622d3aad0b937e49df3b98fce7abf94a77d0d66b8bea01064dda8884ef05128"} Oct 03 14:59:48 crc kubenswrapper[4774]: I1003 14:59:48.424213 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m8z68" event={"ID":"bb71d2a9-eead-444a-ab69-cd9315317392","Type":"ContainerStarted","Data":"03fffac536e16617026539d41dbde80ddf4cb3b5b0cce719929c5aecfc217adf"} Oct 03 14:59:48 crc kubenswrapper[4774]: I1003 14:59:48.424254 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m8z68" event={"ID":"bb71d2a9-eead-444a-ab69-cd9315317392","Type":"ContainerStarted","Data":"ee654c9f07108d14a22a293ddb3532c04ca9b1a2b5653266f72e437769c1205e"} Oct 03 14:59:48 crc kubenswrapper[4774]: I1003 14:59:48.427258 4774 generic.go:334] "Generic (PLEG): container finished" podID="57a4a416-3b4b-48cb-9f31-f7fa373f25df" containerID="86573619be71ee4a7de1e61fbc435b5cbb77829e578c4873438dba6416d337f9" exitCode=0 Oct 03 14:59:48 crc kubenswrapper[4774]: I1003 14:59:48.427298 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c89d5d749-kzl8b" event={"ID":"57a4a416-3b4b-48cb-9f31-f7fa373f25df","Type":"ContainerDied","Data":"86573619be71ee4a7de1e61fbc435b5cbb77829e578c4873438dba6416d337f9"} Oct 03 14:59:48 crc kubenswrapper[4774]: I1003 14:59:48.427321 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c89d5d749-kzl8b" event={"ID":"57a4a416-3b4b-48cb-9f31-f7fa373f25df","Type":"ContainerDied","Data":"303d00979e9a5f7227409852a9d6105d2b531168e21aef835ac1bbb89d32d4e4"} Oct 03 14:59:48 crc kubenswrapper[4774]: I1003 14:59:48.427337 4774 scope.go:117] "RemoveContainer" containerID="86573619be71ee4a7de1e61fbc435b5cbb77829e578c4873438dba6416d337f9" Oct 03 
14:59:48 crc kubenswrapper[4774]: I1003 14:59:48.427354 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c89d5d749-kzl8b" Oct 03 14:59:48 crc kubenswrapper[4774]: I1003 14:59:48.444577 4774 scope.go:117] "RemoveContainer" containerID="05346c83c99d951814371acf83b6147466553a0e3812f207f91f37d8c524eaa1" Oct 03 14:59:48 crc kubenswrapper[4774]: I1003 14:59:48.462264 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-m8z68" podStartSLOduration=2.4622454400000002 podStartE2EDuration="2.46224544s" podCreationTimestamp="2025-10-03 14:59:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:59:48.453523593 +0000 UTC m=+1011.042727085" watchObservedRunningTime="2025-10-03 14:59:48.46224544 +0000 UTC m=+1011.051448902" Oct 03 14:59:48 crc kubenswrapper[4774]: I1003 14:59:48.473573 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c89d5d749-kzl8b"] Oct 03 14:59:48 crc kubenswrapper[4774]: I1003 14:59:48.480346 4774 scope.go:117] "RemoveContainer" containerID="86573619be71ee4a7de1e61fbc435b5cbb77829e578c4873438dba6416d337f9" Oct 03 14:59:48 crc kubenswrapper[4774]: E1003 14:59:48.480841 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86573619be71ee4a7de1e61fbc435b5cbb77829e578c4873438dba6416d337f9\": container with ID starting with 86573619be71ee4a7de1e61fbc435b5cbb77829e578c4873438dba6416d337f9 not found: ID does not exist" containerID="86573619be71ee4a7de1e61fbc435b5cbb77829e578c4873438dba6416d337f9" Oct 03 14:59:48 crc kubenswrapper[4774]: I1003 14:59:48.480875 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86573619be71ee4a7de1e61fbc435b5cbb77829e578c4873438dba6416d337f9"} err="failed to get 
container status \"86573619be71ee4a7de1e61fbc435b5cbb77829e578c4873438dba6416d337f9\": rpc error: code = NotFound desc = could not find container \"86573619be71ee4a7de1e61fbc435b5cbb77829e578c4873438dba6416d337f9\": container with ID starting with 86573619be71ee4a7de1e61fbc435b5cbb77829e578c4873438dba6416d337f9 not found: ID does not exist" Oct 03 14:59:48 crc kubenswrapper[4774]: I1003 14:59:48.480898 4774 scope.go:117] "RemoveContainer" containerID="05346c83c99d951814371acf83b6147466553a0e3812f207f91f37d8c524eaa1" Oct 03 14:59:48 crc kubenswrapper[4774]: E1003 14:59:48.481258 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05346c83c99d951814371acf83b6147466553a0e3812f207f91f37d8c524eaa1\": container with ID starting with 05346c83c99d951814371acf83b6147466553a0e3812f207f91f37d8c524eaa1 not found: ID does not exist" containerID="05346c83c99d951814371acf83b6147466553a0e3812f207f91f37d8c524eaa1" Oct 03 14:59:48 crc kubenswrapper[4774]: I1003 14:59:48.481314 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c89d5d749-kzl8b"] Oct 03 14:59:48 crc kubenswrapper[4774]: I1003 14:59:48.481314 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05346c83c99d951814371acf83b6147466553a0e3812f207f91f37d8c524eaa1"} err="failed to get container status \"05346c83c99d951814371acf83b6147466553a0e3812f207f91f37d8c524eaa1\": rpc error: code = NotFound desc = could not find container \"05346c83c99d951814371acf83b6147466553a0e3812f207f91f37d8c524eaa1\": container with ID starting with 05346c83c99d951814371acf83b6147466553a0e3812f207f91f37d8c524eaa1 not found: ID does not exist" Oct 03 14:59:49 crc kubenswrapper[4774]: I1003 14:59:49.315258 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a4a416-3b4b-48cb-9f31-f7fa373f25df" path="/var/lib/kubelet/pods/57a4a416-3b4b-48cb-9f31-f7fa373f25df/volumes" Oct 
03 14:59:49 crc kubenswrapper[4774]: I1003 14:59:49.440171 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7sz4d" event={"ID":"0dd8b18c-1808-48ac-8375-7cc2aff05b12","Type":"ContainerStarted","Data":"52728a933e58597fffa7010cd561662a26848eab1cc304ee752cd60cc1acdf00"} Oct 03 14:59:49 crc kubenswrapper[4774]: I1003 14:59:49.831812 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jpslj" Oct 03 14:59:49 crc kubenswrapper[4774]: I1003 14:59:49.959162 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhqpd\" (UniqueName: \"kubernetes.io/projected/109aa946-04bb-44f0-be87-89af984dd880-kube-api-access-vhqpd\") pod \"109aa946-04bb-44f0-be87-89af984dd880\" (UID: \"109aa946-04bb-44f0-be87-89af984dd880\") " Oct 03 14:59:49 crc kubenswrapper[4774]: I1003 14:59:49.965135 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/109aa946-04bb-44f0-be87-89af984dd880-kube-api-access-vhqpd" (OuterVolumeSpecName: "kube-api-access-vhqpd") pod "109aa946-04bb-44f0-be87-89af984dd880" (UID: "109aa946-04bb-44f0-be87-89af984dd880"). InnerVolumeSpecName "kube-api-access-vhqpd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:59:50 crc kubenswrapper[4774]: I1003 14:59:50.061727 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhqpd\" (UniqueName: \"kubernetes.io/projected/109aa946-04bb-44f0-be87-89af984dd880-kube-api-access-vhqpd\") on node \"crc\" DevicePath \"\"" Oct 03 14:59:50 crc kubenswrapper[4774]: I1003 14:59:50.456707 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jpslj" event={"ID":"109aa946-04bb-44f0-be87-89af984dd880","Type":"ContainerDied","Data":"9622d3aad0b937e49df3b98fce7abf94a77d0d66b8bea01064dda8884ef05128"} Oct 03 14:59:50 crc kubenswrapper[4774]: I1003 14:59:50.456783 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jpslj" Oct 03 14:59:50 crc kubenswrapper[4774]: I1003 14:59:50.456787 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9622d3aad0b937e49df3b98fce7abf94a77d0d66b8bea01064dda8884ef05128" Oct 03 14:59:50 crc kubenswrapper[4774]: I1003 14:59:50.653292 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:59:50 crc kubenswrapper[4774]: I1003 14:59:50.653358 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:59:50 crc kubenswrapper[4774]: I1003 14:59:50.653433 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" Oct 03 14:59:50 crc kubenswrapper[4774]: I1003 14:59:50.654137 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f6858585d748d7516503bd2f90216465db181a380255963647c750b70d73b203"} pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 14:59:50 crc kubenswrapper[4774]: I1003 14:59:50.654205 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" containerID="cri-o://f6858585d748d7516503bd2f90216465db181a380255963647c750b70d73b203" gracePeriod=600 Oct 03 14:59:51 crc kubenswrapper[4774]: I1003 14:59:51.492627 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-7sz4d" podStartSLOduration=4.492597063 podStartE2EDuration="4.492597063s" podCreationTimestamp="2025-10-03 14:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:59:51.49088067 +0000 UTC m=+1014.080084142" watchObservedRunningTime="2025-10-03 14:59:51.492597063 +0000 UTC m=+1014.081800515" Oct 03 14:59:52 crc kubenswrapper[4774]: I1003 14:59:52.476868 4774 generic.go:334] "Generic (PLEG): container finished" podID="bb71d2a9-eead-444a-ab69-cd9315317392" containerID="03fffac536e16617026539d41dbde80ddf4cb3b5b0cce719929c5aecfc217adf" exitCode=0 Oct 03 14:59:52 crc kubenswrapper[4774]: I1003 14:59:52.476942 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m8z68" 
event={"ID":"bb71d2a9-eead-444a-ab69-cd9315317392","Type":"ContainerDied","Data":"03fffac536e16617026539d41dbde80ddf4cb3b5b0cce719929c5aecfc217adf"} Oct 03 14:59:52 crc kubenswrapper[4774]: I1003 14:59:52.479978 4774 generic.go:334] "Generic (PLEG): container finished" podID="0dd8b18c-1808-48ac-8375-7cc2aff05b12" containerID="52728a933e58597fffa7010cd561662a26848eab1cc304ee752cd60cc1acdf00" exitCode=0 Oct 03 14:59:52 crc kubenswrapper[4774]: I1003 14:59:52.480024 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7sz4d" event={"ID":"0dd8b18c-1808-48ac-8375-7cc2aff05b12","Type":"ContainerDied","Data":"52728a933e58597fffa7010cd561662a26848eab1cc304ee752cd60cc1acdf00"} Oct 03 14:59:52 crc kubenswrapper[4774]: I1003 14:59:52.483559 4774 generic.go:334] "Generic (PLEG): container finished" podID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerID="f6858585d748d7516503bd2f90216465db181a380255963647c750b70d73b203" exitCode=0 Oct 03 14:59:52 crc kubenswrapper[4774]: I1003 14:59:52.483636 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" event={"ID":"ca37ac4b-f421-4198-a179-12901d36f0f5","Type":"ContainerDied","Data":"f6858585d748d7516503bd2f90216465db181a380255963647c750b70d73b203"} Oct 03 14:59:52 crc kubenswrapper[4774]: I1003 14:59:52.483790 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" event={"ID":"ca37ac4b-f421-4198-a179-12901d36f0f5","Type":"ContainerStarted","Data":"3d631133d606feac7cf551b661ad83ff63af803d9385eff9dee1aa2b7ab7a1cd"} Oct 03 14:59:52 crc kubenswrapper[4774]: I1003 14:59:52.483876 4774 scope.go:117] "RemoveContainer" containerID="dd800268763caf1ba49e9f09998c3c8de0daa9481a442c7e1127db9996ab98ab" Oct 03 14:59:53 crc kubenswrapper[4774]: I1003 14:59:53.041115 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 03 
14:59:53 crc kubenswrapper[4774]: I1003 14:59:53.507046 4774 generic.go:334] "Generic (PLEG): container finished" podID="5244bd24-f205-4576-a7cd-6da859f28e21" containerID="aed5639b19e590a01510f40d3a49a4107ad88cc8b2aa7f3d2a9be5595631606b" exitCode=0 Oct 03 14:59:53 crc kubenswrapper[4774]: I1003 14:59:53.507675 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-xxj5b" event={"ID":"5244bd24-f205-4576-a7cd-6da859f28e21","Type":"ContainerDied","Data":"aed5639b19e590a01510f40d3a49a4107ad88cc8b2aa7f3d2a9be5595631606b"} Oct 03 14:59:53 crc kubenswrapper[4774]: I1003 14:59:53.949139 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m8z68" Oct 03 14:59:53 crc kubenswrapper[4774]: I1003 14:59:53.956802 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-7sz4d" Oct 03 14:59:54 crc kubenswrapper[4774]: I1003 14:59:54.042561 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4j4vt\" (UniqueName: \"kubernetes.io/projected/bb71d2a9-eead-444a-ab69-cd9315317392-kube-api-access-4j4vt\") pod \"bb71d2a9-eead-444a-ab69-cd9315317392\" (UID: \"bb71d2a9-eead-444a-ab69-cd9315317392\") " Oct 03 14:59:54 crc kubenswrapper[4774]: I1003 14:59:54.042704 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c76gd\" (UniqueName: \"kubernetes.io/projected/0dd8b18c-1808-48ac-8375-7cc2aff05b12-kube-api-access-c76gd\") pod \"0dd8b18c-1808-48ac-8375-7cc2aff05b12\" (UID: \"0dd8b18c-1808-48ac-8375-7cc2aff05b12\") " Oct 03 14:59:54 crc kubenswrapper[4774]: I1003 14:59:54.048480 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb71d2a9-eead-444a-ab69-cd9315317392-kube-api-access-4j4vt" (OuterVolumeSpecName: "kube-api-access-4j4vt") pod "bb71d2a9-eead-444a-ab69-cd9315317392" (UID: 
"bb71d2a9-eead-444a-ab69-cd9315317392"). InnerVolumeSpecName "kube-api-access-4j4vt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:59:54 crc kubenswrapper[4774]: I1003 14:59:54.057842 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dd8b18c-1808-48ac-8375-7cc2aff05b12-kube-api-access-c76gd" (OuterVolumeSpecName: "kube-api-access-c76gd") pod "0dd8b18c-1808-48ac-8375-7cc2aff05b12" (UID: "0dd8b18c-1808-48ac-8375-7cc2aff05b12"). InnerVolumeSpecName "kube-api-access-c76gd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:59:54 crc kubenswrapper[4774]: I1003 14:59:54.144943 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c76gd\" (UniqueName: \"kubernetes.io/projected/0dd8b18c-1808-48ac-8375-7cc2aff05b12-kube-api-access-c76gd\") on node \"crc\" DevicePath \"\"" Oct 03 14:59:54 crc kubenswrapper[4774]: I1003 14:59:54.144984 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4j4vt\" (UniqueName: \"kubernetes.io/projected/bb71d2a9-eead-444a-ab69-cd9315317392-kube-api-access-4j4vt\") on node \"crc\" DevicePath \"\"" Oct 03 14:59:54 crc kubenswrapper[4774]: I1003 14:59:54.531286 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-7sz4d" Oct 03 14:59:54 crc kubenswrapper[4774]: I1003 14:59:54.531293 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7sz4d" event={"ID":"0dd8b18c-1808-48ac-8375-7cc2aff05b12","Type":"ContainerDied","Data":"ebb1373e6565561c4aebf31513efc8d5be4737173b5242fd41c1243d0841e9a3"} Oct 03 14:59:54 crc kubenswrapper[4774]: I1003 14:59:54.531917 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebb1373e6565561c4aebf31513efc8d5be4737173b5242fd41c1243d0841e9a3" Oct 03 14:59:54 crc kubenswrapper[4774]: I1003 14:59:54.534418 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m8z68" event={"ID":"bb71d2a9-eead-444a-ab69-cd9315317392","Type":"ContainerDied","Data":"ee654c9f07108d14a22a293ddb3532c04ca9b1a2b5653266f72e437769c1205e"} Oct 03 14:59:54 crc kubenswrapper[4774]: I1003 14:59:54.534457 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee654c9f07108d14a22a293ddb3532c04ca9b1a2b5653266f72e437769c1205e" Oct 03 14:59:54 crc kubenswrapper[4774]: I1003 14:59:54.534459 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m8z68" Oct 03 14:59:54 crc kubenswrapper[4774]: I1003 14:59:54.926940 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-xxj5b" Oct 03 14:59:54 crc kubenswrapper[4774]: I1003 14:59:54.960056 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5244bd24-f205-4576-a7cd-6da859f28e21-swiftconf\") pod \"5244bd24-f205-4576-a7cd-6da859f28e21\" (UID: \"5244bd24-f205-4576-a7cd-6da859f28e21\") " Oct 03 14:59:54 crc kubenswrapper[4774]: I1003 14:59:54.960153 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5244bd24-f205-4576-a7cd-6da859f28e21-etc-swift\") pod \"5244bd24-f205-4576-a7cd-6da859f28e21\" (UID: \"5244bd24-f205-4576-a7cd-6da859f28e21\") " Oct 03 14:59:54 crc kubenswrapper[4774]: I1003 14:59:54.960193 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5244bd24-f205-4576-a7cd-6da859f28e21-ring-data-devices\") pod \"5244bd24-f205-4576-a7cd-6da859f28e21\" (UID: \"5244bd24-f205-4576-a7cd-6da859f28e21\") " Oct 03 14:59:54 crc kubenswrapper[4774]: I1003 14:59:54.960304 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5244bd24-f205-4576-a7cd-6da859f28e21-dispersionconf\") pod \"5244bd24-f205-4576-a7cd-6da859f28e21\" (UID: \"5244bd24-f205-4576-a7cd-6da859f28e21\") " Oct 03 14:59:54 crc kubenswrapper[4774]: I1003 14:59:54.960351 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5244bd24-f205-4576-a7cd-6da859f28e21-scripts\") pod \"5244bd24-f205-4576-a7cd-6da859f28e21\" (UID: \"5244bd24-f205-4576-a7cd-6da859f28e21\") " Oct 03 14:59:54 crc kubenswrapper[4774]: I1003 14:59:54.960412 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k624m\" (UniqueName: 
\"kubernetes.io/projected/5244bd24-f205-4576-a7cd-6da859f28e21-kube-api-access-k624m\") pod \"5244bd24-f205-4576-a7cd-6da859f28e21\" (UID: \"5244bd24-f205-4576-a7cd-6da859f28e21\") " Oct 03 14:59:54 crc kubenswrapper[4774]: I1003 14:59:54.960497 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5244bd24-f205-4576-a7cd-6da859f28e21-combined-ca-bundle\") pod \"5244bd24-f205-4576-a7cd-6da859f28e21\" (UID: \"5244bd24-f205-4576-a7cd-6da859f28e21\") " Oct 03 14:59:54 crc kubenswrapper[4774]: I1003 14:59:54.962343 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5244bd24-f205-4576-a7cd-6da859f28e21-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "5244bd24-f205-4576-a7cd-6da859f28e21" (UID: "5244bd24-f205-4576-a7cd-6da859f28e21"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:59:54 crc kubenswrapper[4774]: I1003 14:59:54.964341 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5244bd24-f205-4576-a7cd-6da859f28e21-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "5244bd24-f205-4576-a7cd-6da859f28e21" (UID: "5244bd24-f205-4576-a7cd-6da859f28e21"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:59:54 crc kubenswrapper[4774]: I1003 14:59:54.980568 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5244bd24-f205-4576-a7cd-6da859f28e21-kube-api-access-k624m" (OuterVolumeSpecName: "kube-api-access-k624m") pod "5244bd24-f205-4576-a7cd-6da859f28e21" (UID: "5244bd24-f205-4576-a7cd-6da859f28e21"). InnerVolumeSpecName "kube-api-access-k624m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:59:54 crc kubenswrapper[4774]: I1003 14:59:54.985024 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5244bd24-f205-4576-a7cd-6da859f28e21-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "5244bd24-f205-4576-a7cd-6da859f28e21" (UID: "5244bd24-f205-4576-a7cd-6da859f28e21"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:59:54 crc kubenswrapper[4774]: I1003 14:59:54.994616 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5244bd24-f205-4576-a7cd-6da859f28e21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5244bd24-f205-4576-a7cd-6da859f28e21" (UID: "5244bd24-f205-4576-a7cd-6da859f28e21"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:59:54 crc kubenswrapper[4774]: I1003 14:59:54.995808 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5244bd24-f205-4576-a7cd-6da859f28e21-scripts" (OuterVolumeSpecName: "scripts") pod "5244bd24-f205-4576-a7cd-6da859f28e21" (UID: "5244bd24-f205-4576-a7cd-6da859f28e21"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:59:55 crc kubenswrapper[4774]: I1003 14:59:55.018483 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5244bd24-f205-4576-a7cd-6da859f28e21-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "5244bd24-f205-4576-a7cd-6da859f28e21" (UID: "5244bd24-f205-4576-a7cd-6da859f28e21"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:59:55 crc kubenswrapper[4774]: I1003 14:59:55.062092 4774 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5244bd24-f205-4576-a7cd-6da859f28e21-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 03 14:59:55 crc kubenswrapper[4774]: I1003 14:59:55.062131 4774 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5244bd24-f205-4576-a7cd-6da859f28e21-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 03 14:59:55 crc kubenswrapper[4774]: I1003 14:59:55.062146 4774 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5244bd24-f205-4576-a7cd-6da859f28e21-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 03 14:59:55 crc kubenswrapper[4774]: I1003 14:59:55.062160 4774 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5244bd24-f205-4576-a7cd-6da859f28e21-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 03 14:59:55 crc kubenswrapper[4774]: I1003 14:59:55.062173 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5244bd24-f205-4576-a7cd-6da859f28e21-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:59:55 crc kubenswrapper[4774]: I1003 14:59:55.062186 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k624m\" (UniqueName: \"kubernetes.io/projected/5244bd24-f205-4576-a7cd-6da859f28e21-kube-api-access-k624m\") on node \"crc\" DevicePath \"\"" Oct 03 14:59:55 crc kubenswrapper[4774]: I1003 14:59:55.062197 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5244bd24-f205-4576-a7cd-6da859f28e21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:59:55 crc kubenswrapper[4774]: I1003 14:59:55.543827 4774 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-xxj5b" event={"ID":"5244bd24-f205-4576-a7cd-6da859f28e21","Type":"ContainerDied","Data":"841d825bb85699b2d63c1f7810438c161344edf5a91a1b3f0b0b7b07ab84bf07"} Oct 03 14:59:55 crc kubenswrapper[4774]: I1003 14:59:55.543879 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="841d825bb85699b2d63c1f7810438c161344edf5a91a1b3f0b0b7b07ab84bf07" Oct 03 14:59:55 crc kubenswrapper[4774]: I1003 14:59:55.543884 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-xxj5b" Oct 03 14:59:56 crc kubenswrapper[4774]: I1003 14:59:56.776563 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-eb1a-account-create-6smf5"] Oct 03 14:59:56 crc kubenswrapper[4774]: E1003 14:59:56.777397 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57a4a416-3b4b-48cb-9f31-f7fa373f25df" containerName="dnsmasq-dns" Oct 03 14:59:56 crc kubenswrapper[4774]: I1003 14:59:56.777416 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a4a416-3b4b-48cb-9f31-f7fa373f25df" containerName="dnsmasq-dns" Oct 03 14:59:56 crc kubenswrapper[4774]: E1003 14:59:56.777436 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="109aa946-04bb-44f0-be87-89af984dd880" containerName="mariadb-database-create" Oct 03 14:59:56 crc kubenswrapper[4774]: I1003 14:59:56.777445 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="109aa946-04bb-44f0-be87-89af984dd880" containerName="mariadb-database-create" Oct 03 14:59:56 crc kubenswrapper[4774]: E1003 14:59:56.777480 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57a4a416-3b4b-48cb-9f31-f7fa373f25df" containerName="init" Oct 03 14:59:56 crc kubenswrapper[4774]: I1003 14:59:56.777491 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a4a416-3b4b-48cb-9f31-f7fa373f25df" containerName="init" Oct 03 14:59:56 crc 
kubenswrapper[4774]: E1003 14:59:56.777508 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5244bd24-f205-4576-a7cd-6da859f28e21" containerName="swift-ring-rebalance" Oct 03 14:59:56 crc kubenswrapper[4774]: I1003 14:59:56.777517 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="5244bd24-f205-4576-a7cd-6da859f28e21" containerName="swift-ring-rebalance" Oct 03 14:59:56 crc kubenswrapper[4774]: E1003 14:59:56.777530 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dd8b18c-1808-48ac-8375-7cc2aff05b12" containerName="mariadb-database-create" Oct 03 14:59:56 crc kubenswrapper[4774]: I1003 14:59:56.777539 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dd8b18c-1808-48ac-8375-7cc2aff05b12" containerName="mariadb-database-create" Oct 03 14:59:56 crc kubenswrapper[4774]: E1003 14:59:56.777555 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb71d2a9-eead-444a-ab69-cd9315317392" containerName="mariadb-database-create" Oct 03 14:59:56 crc kubenswrapper[4774]: I1003 14:59:56.777563 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb71d2a9-eead-444a-ab69-cd9315317392" containerName="mariadb-database-create" Oct 03 14:59:56 crc kubenswrapper[4774]: I1003 14:59:56.777763 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb71d2a9-eead-444a-ab69-cd9315317392" containerName="mariadb-database-create" Oct 03 14:59:56 crc kubenswrapper[4774]: I1003 14:59:56.777779 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dd8b18c-1808-48ac-8375-7cc2aff05b12" containerName="mariadb-database-create" Oct 03 14:59:56 crc kubenswrapper[4774]: I1003 14:59:56.777798 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="109aa946-04bb-44f0-be87-89af984dd880" containerName="mariadb-database-create" Oct 03 14:59:56 crc kubenswrapper[4774]: I1003 14:59:56.777814 4774 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="57a4a416-3b4b-48cb-9f31-f7fa373f25df" containerName="dnsmasq-dns" Oct 03 14:59:56 crc kubenswrapper[4774]: I1003 14:59:56.777829 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="5244bd24-f205-4576-a7cd-6da859f28e21" containerName="swift-ring-rebalance" Oct 03 14:59:56 crc kubenswrapper[4774]: I1003 14:59:56.778472 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-eb1a-account-create-6smf5" Oct 03 14:59:56 crc kubenswrapper[4774]: I1003 14:59:56.782363 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 03 14:59:56 crc kubenswrapper[4774]: I1003 14:59:56.790308 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf27b\" (UniqueName: \"kubernetes.io/projected/40ae99e2-2da0-4b3c-b1e4-92101379dcbf-kube-api-access-xf27b\") pod \"keystone-eb1a-account-create-6smf5\" (UID: \"40ae99e2-2da0-4b3c-b1e4-92101379dcbf\") " pod="openstack/keystone-eb1a-account-create-6smf5" Oct 03 14:59:56 crc kubenswrapper[4774]: I1003 14:59:56.790351 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-eb1a-account-create-6smf5"] Oct 03 14:59:56 crc kubenswrapper[4774]: I1003 14:59:56.892140 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf27b\" (UniqueName: \"kubernetes.io/projected/40ae99e2-2da0-4b3c-b1e4-92101379dcbf-kube-api-access-xf27b\") pod \"keystone-eb1a-account-create-6smf5\" (UID: \"40ae99e2-2da0-4b3c-b1e4-92101379dcbf\") " pod="openstack/keystone-eb1a-account-create-6smf5" Oct 03 14:59:56 crc kubenswrapper[4774]: I1003 14:59:56.921104 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf27b\" (UniqueName: \"kubernetes.io/projected/40ae99e2-2da0-4b3c-b1e4-92101379dcbf-kube-api-access-xf27b\") pod \"keystone-eb1a-account-create-6smf5\" (UID: 
\"40ae99e2-2da0-4b3c-b1e4-92101379dcbf\") " pod="openstack/keystone-eb1a-account-create-6smf5" Oct 03 14:59:57 crc kubenswrapper[4774]: I1003 14:59:57.108501 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-eb1a-account-create-6smf5" Oct 03 14:59:57 crc kubenswrapper[4774]: I1003 14:59:57.305801 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-c7c3-account-create-97bj7"] Oct 03 14:59:57 crc kubenswrapper[4774]: I1003 14:59:57.308757 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c7c3-account-create-97bj7" Oct 03 14:59:57 crc kubenswrapper[4774]: I1003 14:59:57.317768 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 03 14:59:57 crc kubenswrapper[4774]: I1003 14:59:57.339107 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c7c3-account-create-97bj7"] Oct 03 14:59:57 crc kubenswrapper[4774]: I1003 14:59:57.408026 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdsfd\" (UniqueName: \"kubernetes.io/projected/1fc94371-a912-4891-a328-1bc2b840fa61-kube-api-access-rdsfd\") pod \"glance-c7c3-account-create-97bj7\" (UID: \"1fc94371-a912-4891-a328-1bc2b840fa61\") " pod="openstack/glance-c7c3-account-create-97bj7" Oct 03 14:59:57 crc kubenswrapper[4774]: I1003 14:59:57.509541 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdsfd\" (UniqueName: \"kubernetes.io/projected/1fc94371-a912-4891-a328-1bc2b840fa61-kube-api-access-rdsfd\") pod \"glance-c7c3-account-create-97bj7\" (UID: \"1fc94371-a912-4891-a328-1bc2b840fa61\") " pod="openstack/glance-c7c3-account-create-97bj7" Oct 03 14:59:57 crc kubenswrapper[4774]: I1003 14:59:57.529085 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdsfd\" (UniqueName: 
\"kubernetes.io/projected/1fc94371-a912-4891-a328-1bc2b840fa61-kube-api-access-rdsfd\") pod \"glance-c7c3-account-create-97bj7\" (UID: \"1fc94371-a912-4891-a328-1bc2b840fa61\") " pod="openstack/glance-c7c3-account-create-97bj7" Oct 03 14:59:57 crc kubenswrapper[4774]: I1003 14:59:57.603511 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-eb1a-account-create-6smf5"] Oct 03 14:59:57 crc kubenswrapper[4774]: W1003 14:59:57.615523 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40ae99e2_2da0_4b3c_b1e4_92101379dcbf.slice/crio-f1e82b0a8f674edee007d6041a6116b189f85481b2f57f508bd43fbee51806f5 WatchSource:0}: Error finding container f1e82b0a8f674edee007d6041a6116b189f85481b2f57f508bd43fbee51806f5: Status 404 returned error can't find the container with id f1e82b0a8f674edee007d6041a6116b189f85481b2f57f508bd43fbee51806f5 Oct 03 14:59:57 crc kubenswrapper[4774]: I1003 14:59:57.643526 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-c7c3-account-create-97bj7" Oct 03 14:59:58 crc kubenswrapper[4774]: I1003 14:59:58.155403 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c7c3-account-create-97bj7"] Oct 03 14:59:58 crc kubenswrapper[4774]: W1003 14:59:58.160050 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fc94371_a912_4891_a328_1bc2b840fa61.slice/crio-26015a48e1f3d14a1182e957cc5fb9064f18ea88655aa2ad6e6e574b6649cdc2 WatchSource:0}: Error finding container 26015a48e1f3d14a1182e957cc5fb9064f18ea88655aa2ad6e6e574b6649cdc2: Status 404 returned error can't find the container with id 26015a48e1f3d14a1182e957cc5fb9064f18ea88655aa2ad6e6e574b6649cdc2 Oct 03 14:59:58 crc kubenswrapper[4774]: I1003 14:59:58.574755 4774 generic.go:334] "Generic (PLEG): container finished" podID="40ae99e2-2da0-4b3c-b1e4-92101379dcbf" containerID="6bed31e5afaa827109722d902bec7fe86975cf90c18bbb62ab8c3a3f56f37b11" exitCode=0 Oct 03 14:59:58 crc kubenswrapper[4774]: I1003 14:59:58.574814 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-eb1a-account-create-6smf5" event={"ID":"40ae99e2-2da0-4b3c-b1e4-92101379dcbf","Type":"ContainerDied","Data":"6bed31e5afaa827109722d902bec7fe86975cf90c18bbb62ab8c3a3f56f37b11"} Oct 03 14:59:58 crc kubenswrapper[4774]: I1003 14:59:58.575095 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-eb1a-account-create-6smf5" event={"ID":"40ae99e2-2da0-4b3c-b1e4-92101379dcbf","Type":"ContainerStarted","Data":"f1e82b0a8f674edee007d6041a6116b189f85481b2f57f508bd43fbee51806f5"} Oct 03 14:59:58 crc kubenswrapper[4774]: I1003 14:59:58.577536 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c7c3-account-create-97bj7" event={"ID":"1fc94371-a912-4891-a328-1bc2b840fa61","Type":"ContainerStarted","Data":"701fe0ed81ae955a0e61706167acc81a8fac0753da1cee9ba2afb8b71a01d0cd"} 
Oct 03 14:59:58 crc kubenswrapper[4774]: I1003 14:59:58.577576 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c7c3-account-create-97bj7" event={"ID":"1fc94371-a912-4891-a328-1bc2b840fa61","Type":"ContainerStarted","Data":"26015a48e1f3d14a1182e957cc5fb9064f18ea88655aa2ad6e6e574b6649cdc2"} Oct 03 14:59:58 crc kubenswrapper[4774]: I1003 14:59:58.613640 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-c7c3-account-create-97bj7" podStartSLOduration=1.613620992 podStartE2EDuration="1.613620992s" podCreationTimestamp="2025-10-03 14:59:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:59:58.603769197 +0000 UTC m=+1021.192972669" watchObservedRunningTime="2025-10-03 14:59:58.613620992 +0000 UTC m=+1021.202824454" Oct 03 14:59:59 crc kubenswrapper[4774]: I1003 14:59:59.593192 4774 generic.go:334] "Generic (PLEG): container finished" podID="1fc94371-a912-4891-a328-1bc2b840fa61" containerID="701fe0ed81ae955a0e61706167acc81a8fac0753da1cee9ba2afb8b71a01d0cd" exitCode=0 Oct 03 14:59:59 crc kubenswrapper[4774]: I1003 14:59:59.593287 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c7c3-account-create-97bj7" event={"ID":"1fc94371-a912-4891-a328-1bc2b840fa61","Type":"ContainerDied","Data":"701fe0ed81ae955a0e61706167acc81a8fac0753da1cee9ba2afb8b71a01d0cd"} Oct 03 14:59:59 crc kubenswrapper[4774]: I1003 14:59:59.996655 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-eb1a-account-create-6smf5" Oct 03 15:00:00 crc kubenswrapper[4774]: I1003 15:00:00.129768 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325060-lqwfp"] Oct 03 15:00:00 crc kubenswrapper[4774]: E1003 15:00:00.130160 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40ae99e2-2da0-4b3c-b1e4-92101379dcbf" containerName="mariadb-account-create" Oct 03 15:00:00 crc kubenswrapper[4774]: I1003 15:00:00.130181 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="40ae99e2-2da0-4b3c-b1e4-92101379dcbf" containerName="mariadb-account-create" Oct 03 15:00:00 crc kubenswrapper[4774]: I1003 15:00:00.130402 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="40ae99e2-2da0-4b3c-b1e4-92101379dcbf" containerName="mariadb-account-create" Oct 03 15:00:00 crc kubenswrapper[4774]: I1003 15:00:00.130959 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-lqwfp" Oct 03 15:00:00 crc kubenswrapper[4774]: I1003 15:00:00.132964 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 15:00:00 crc kubenswrapper[4774]: I1003 15:00:00.133674 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 15:00:00 crc kubenswrapper[4774]: I1003 15:00:00.146889 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325060-lqwfp"] Oct 03 15:00:00 crc kubenswrapper[4774]: I1003 15:00:00.161956 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xf27b\" (UniqueName: \"kubernetes.io/projected/40ae99e2-2da0-4b3c-b1e4-92101379dcbf-kube-api-access-xf27b\") pod \"40ae99e2-2da0-4b3c-b1e4-92101379dcbf\" (UID: \"40ae99e2-2da0-4b3c-b1e4-92101379dcbf\") " Oct 03 15:00:00 crc kubenswrapper[4774]: I1003 15:00:00.162340 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v99s\" (UniqueName: \"kubernetes.io/projected/ba5eed25-71ac-44e3-bf15-daf7b9ac13c6-kube-api-access-8v99s\") pod \"collect-profiles-29325060-lqwfp\" (UID: \"ba5eed25-71ac-44e3-bf15-daf7b9ac13c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-lqwfp" Oct 03 15:00:00 crc kubenswrapper[4774]: I1003 15:00:00.162467 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba5eed25-71ac-44e3-bf15-daf7b9ac13c6-config-volume\") pod \"collect-profiles-29325060-lqwfp\" (UID: \"ba5eed25-71ac-44e3-bf15-daf7b9ac13c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-lqwfp" Oct 03 15:00:00 crc kubenswrapper[4774]: 
I1003 15:00:00.162516 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ba5eed25-71ac-44e3-bf15-daf7b9ac13c6-secret-volume\") pod \"collect-profiles-29325060-lqwfp\" (UID: \"ba5eed25-71ac-44e3-bf15-daf7b9ac13c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-lqwfp" Oct 03 15:00:00 crc kubenswrapper[4774]: I1003 15:00:00.166555 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40ae99e2-2da0-4b3c-b1e4-92101379dcbf-kube-api-access-xf27b" (OuterVolumeSpecName: "kube-api-access-xf27b") pod "40ae99e2-2da0-4b3c-b1e4-92101379dcbf" (UID: "40ae99e2-2da0-4b3c-b1e4-92101379dcbf"). InnerVolumeSpecName "kube-api-access-xf27b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:00:00 crc kubenswrapper[4774]: I1003 15:00:00.263980 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba5eed25-71ac-44e3-bf15-daf7b9ac13c6-config-volume\") pod \"collect-profiles-29325060-lqwfp\" (UID: \"ba5eed25-71ac-44e3-bf15-daf7b9ac13c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-lqwfp" Oct 03 15:00:00 crc kubenswrapper[4774]: I1003 15:00:00.264028 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ba5eed25-71ac-44e3-bf15-daf7b9ac13c6-secret-volume\") pod \"collect-profiles-29325060-lqwfp\" (UID: \"ba5eed25-71ac-44e3-bf15-daf7b9ac13c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-lqwfp" Oct 03 15:00:00 crc kubenswrapper[4774]: I1003 15:00:00.264115 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v99s\" (UniqueName: \"kubernetes.io/projected/ba5eed25-71ac-44e3-bf15-daf7b9ac13c6-kube-api-access-8v99s\") pod 
\"collect-profiles-29325060-lqwfp\" (UID: \"ba5eed25-71ac-44e3-bf15-daf7b9ac13c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-lqwfp" Oct 03 15:00:00 crc kubenswrapper[4774]: I1003 15:00:00.264185 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xf27b\" (UniqueName: \"kubernetes.io/projected/40ae99e2-2da0-4b3c-b1e4-92101379dcbf-kube-api-access-xf27b\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:00 crc kubenswrapper[4774]: I1003 15:00:00.264871 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba5eed25-71ac-44e3-bf15-daf7b9ac13c6-config-volume\") pod \"collect-profiles-29325060-lqwfp\" (UID: \"ba5eed25-71ac-44e3-bf15-daf7b9ac13c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-lqwfp" Oct 03 15:00:00 crc kubenswrapper[4774]: I1003 15:00:00.269452 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ba5eed25-71ac-44e3-bf15-daf7b9ac13c6-secret-volume\") pod \"collect-profiles-29325060-lqwfp\" (UID: \"ba5eed25-71ac-44e3-bf15-daf7b9ac13c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-lqwfp" Oct 03 15:00:00 crc kubenswrapper[4774]: I1003 15:00:00.284959 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v99s\" (UniqueName: \"kubernetes.io/projected/ba5eed25-71ac-44e3-bf15-daf7b9ac13c6-kube-api-access-8v99s\") pod \"collect-profiles-29325060-lqwfp\" (UID: \"ba5eed25-71ac-44e3-bf15-daf7b9ac13c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-lqwfp" Oct 03 15:00:00 crc kubenswrapper[4774]: I1003 15:00:00.450558 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-lqwfp" Oct 03 15:00:00 crc kubenswrapper[4774]: I1003 15:00:00.608356 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-eb1a-account-create-6smf5" Oct 03 15:00:00 crc kubenswrapper[4774]: I1003 15:00:00.608422 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-eb1a-account-create-6smf5" event={"ID":"40ae99e2-2da0-4b3c-b1e4-92101379dcbf","Type":"ContainerDied","Data":"f1e82b0a8f674edee007d6041a6116b189f85481b2f57f508bd43fbee51806f5"} Oct 03 15:00:00 crc kubenswrapper[4774]: I1003 15:00:00.608479 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1e82b0a8f674edee007d6041a6116b189f85481b2f57f508bd43fbee51806f5" Oct 03 15:00:00 crc kubenswrapper[4774]: I1003 15:00:00.866035 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c7c3-account-create-97bj7" Oct 03 15:00:00 crc kubenswrapper[4774]: I1003 15:00:00.873711 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdsfd\" (UniqueName: \"kubernetes.io/projected/1fc94371-a912-4891-a328-1bc2b840fa61-kube-api-access-rdsfd\") pod \"1fc94371-a912-4891-a328-1bc2b840fa61\" (UID: \"1fc94371-a912-4891-a328-1bc2b840fa61\") " Oct 03 15:00:00 crc kubenswrapper[4774]: I1003 15:00:00.883342 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fc94371-a912-4891-a328-1bc2b840fa61-kube-api-access-rdsfd" (OuterVolumeSpecName: "kube-api-access-rdsfd") pod "1fc94371-a912-4891-a328-1bc2b840fa61" (UID: "1fc94371-a912-4891-a328-1bc2b840fa61"). InnerVolumeSpecName "kube-api-access-rdsfd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:00:00 crc kubenswrapper[4774]: I1003 15:00:00.928740 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325060-lqwfp"] Oct 03 15:00:00 crc kubenswrapper[4774]: W1003 15:00:00.930752 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba5eed25_71ac_44e3_bf15_daf7b9ac13c6.slice/crio-380f4ce25e2e9cb124c9f0ccd9926b799596284b40b58351442cb9076272062a WatchSource:0}: Error finding container 380f4ce25e2e9cb124c9f0ccd9926b799596284b40b58351442cb9076272062a: Status 404 returned error can't find the container with id 380f4ce25e2e9cb124c9f0ccd9926b799596284b40b58351442cb9076272062a Oct 03 15:00:00 crc kubenswrapper[4774]: I1003 15:00:00.974947 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdsfd\" (UniqueName: \"kubernetes.io/projected/1fc94371-a912-4891-a328-1bc2b840fa61-kube-api-access-rdsfd\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:01 crc kubenswrapper[4774]: I1003 15:00:01.619292 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c7c3-account-create-97bj7" event={"ID":"1fc94371-a912-4891-a328-1bc2b840fa61","Type":"ContainerDied","Data":"26015a48e1f3d14a1182e957cc5fb9064f18ea88655aa2ad6e6e574b6649cdc2"} Oct 03 15:00:01 crc kubenswrapper[4774]: I1003 15:00:01.619657 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26015a48e1f3d14a1182e957cc5fb9064f18ea88655aa2ad6e6e574b6649cdc2" Oct 03 15:00:01 crc kubenswrapper[4774]: I1003 15:00:01.619322 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-c7c3-account-create-97bj7" Oct 03 15:00:01 crc kubenswrapper[4774]: I1003 15:00:01.621505 4774 generic.go:334] "Generic (PLEG): container finished" podID="ba5eed25-71ac-44e3-bf15-daf7b9ac13c6" containerID="6280c8b355da60f28b4280f79f5bac30a90c851afd14db16666528c1575c1830" exitCode=0 Oct 03 15:00:01 crc kubenswrapper[4774]: I1003 15:00:01.621544 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-lqwfp" event={"ID":"ba5eed25-71ac-44e3-bf15-daf7b9ac13c6","Type":"ContainerDied","Data":"6280c8b355da60f28b4280f79f5bac30a90c851afd14db16666528c1575c1830"} Oct 03 15:00:01 crc kubenswrapper[4774]: I1003 15:00:01.621572 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-lqwfp" event={"ID":"ba5eed25-71ac-44e3-bf15-daf7b9ac13c6","Type":"ContainerStarted","Data":"380f4ce25e2e9cb124c9f0ccd9926b799596284b40b58351442cb9076272062a"} Oct 03 15:00:01 crc kubenswrapper[4774]: I1003 15:00:01.991296 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bc0b2c39-8c1e-4401-97f6-a4306b435436-etc-swift\") pod \"swift-storage-0\" (UID: \"bc0b2c39-8c1e-4401-97f6-a4306b435436\") " pod="openstack/swift-storage-0" Oct 03 15:00:02 crc kubenswrapper[4774]: I1003 15:00:02.002557 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bc0b2c39-8c1e-4401-97f6-a4306b435436-etc-swift\") pod \"swift-storage-0\" (UID: \"bc0b2c39-8c1e-4401-97f6-a4306b435436\") " pod="openstack/swift-storage-0" Oct 03 15:00:02 crc kubenswrapper[4774]: I1003 15:00:02.023985 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 03 15:00:02 crc kubenswrapper[4774]: I1003 15:00:02.374888 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-k72d4"] Oct 03 15:00:02 crc kubenswrapper[4774]: E1003 15:00:02.375524 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fc94371-a912-4891-a328-1bc2b840fa61" containerName="mariadb-account-create" Oct 03 15:00:02 crc kubenswrapper[4774]: I1003 15:00:02.375540 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fc94371-a912-4891-a328-1bc2b840fa61" containerName="mariadb-account-create" Oct 03 15:00:02 crc kubenswrapper[4774]: I1003 15:00:02.375684 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fc94371-a912-4891-a328-1bc2b840fa61" containerName="mariadb-account-create" Oct 03 15:00:02 crc kubenswrapper[4774]: I1003 15:00:02.376220 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-k72d4" Oct 03 15:00:02 crc kubenswrapper[4774]: I1003 15:00:02.378193 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-q7vtm" Oct 03 15:00:02 crc kubenswrapper[4774]: I1003 15:00:02.378430 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 03 15:00:02 crc kubenswrapper[4774]: I1003 15:00:02.386442 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-k72d4"] Oct 03 15:00:02 crc kubenswrapper[4774]: I1003 15:00:02.500114 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c2fdc49-a155-4a5b-afce-314af82e3f5b-combined-ca-bundle\") pod \"glance-db-sync-k72d4\" (UID: \"6c2fdc49-a155-4a5b-afce-314af82e3f5b\") " pod="openstack/glance-db-sync-k72d4" Oct 03 15:00:02 crc kubenswrapper[4774]: I1003 15:00:02.500204 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6c2fdc49-a155-4a5b-afce-314af82e3f5b-db-sync-config-data\") pod \"glance-db-sync-k72d4\" (UID: \"6c2fdc49-a155-4a5b-afce-314af82e3f5b\") " pod="openstack/glance-db-sync-k72d4" Oct 03 15:00:02 crc kubenswrapper[4774]: I1003 15:00:02.500305 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c2fdc49-a155-4a5b-afce-314af82e3f5b-config-data\") pod \"glance-db-sync-k72d4\" (UID: \"6c2fdc49-a155-4a5b-afce-314af82e3f5b\") " pod="openstack/glance-db-sync-k72d4" Oct 03 15:00:02 crc kubenswrapper[4774]: I1003 15:00:02.500464 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmhlw\" (UniqueName: \"kubernetes.io/projected/6c2fdc49-a155-4a5b-afce-314af82e3f5b-kube-api-access-wmhlw\") pod \"glance-db-sync-k72d4\" (UID: \"6c2fdc49-a155-4a5b-afce-314af82e3f5b\") " pod="openstack/glance-db-sync-k72d4" Oct 03 15:00:02 crc kubenswrapper[4774]: I1003 15:00:02.555128 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 03 15:00:02 crc kubenswrapper[4774]: I1003 15:00:02.601329 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c2fdc49-a155-4a5b-afce-314af82e3f5b-config-data\") pod \"glance-db-sync-k72d4\" (UID: \"6c2fdc49-a155-4a5b-afce-314af82e3f5b\") " pod="openstack/glance-db-sync-k72d4" Oct 03 15:00:02 crc kubenswrapper[4774]: I1003 15:00:02.601939 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmhlw\" (UniqueName: \"kubernetes.io/projected/6c2fdc49-a155-4a5b-afce-314af82e3f5b-kube-api-access-wmhlw\") pod \"glance-db-sync-k72d4\" (UID: \"6c2fdc49-a155-4a5b-afce-314af82e3f5b\") " pod="openstack/glance-db-sync-k72d4" Oct 
03 15:00:02 crc kubenswrapper[4774]: I1003 15:00:02.602086 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c2fdc49-a155-4a5b-afce-314af82e3f5b-combined-ca-bundle\") pod \"glance-db-sync-k72d4\" (UID: \"6c2fdc49-a155-4a5b-afce-314af82e3f5b\") " pod="openstack/glance-db-sync-k72d4" Oct 03 15:00:02 crc kubenswrapper[4774]: I1003 15:00:02.602217 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6c2fdc49-a155-4a5b-afce-314af82e3f5b-db-sync-config-data\") pod \"glance-db-sync-k72d4\" (UID: \"6c2fdc49-a155-4a5b-afce-314af82e3f5b\") " pod="openstack/glance-db-sync-k72d4" Oct 03 15:00:02 crc kubenswrapper[4774]: I1003 15:00:02.607423 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c2fdc49-a155-4a5b-afce-314af82e3f5b-combined-ca-bundle\") pod \"glance-db-sync-k72d4\" (UID: \"6c2fdc49-a155-4a5b-afce-314af82e3f5b\") " pod="openstack/glance-db-sync-k72d4" Oct 03 15:00:02 crc kubenswrapper[4774]: I1003 15:00:02.607461 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c2fdc49-a155-4a5b-afce-314af82e3f5b-config-data\") pod \"glance-db-sync-k72d4\" (UID: \"6c2fdc49-a155-4a5b-afce-314af82e3f5b\") " pod="openstack/glance-db-sync-k72d4" Oct 03 15:00:02 crc kubenswrapper[4774]: I1003 15:00:02.608347 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6c2fdc49-a155-4a5b-afce-314af82e3f5b-db-sync-config-data\") pod \"glance-db-sync-k72d4\" (UID: \"6c2fdc49-a155-4a5b-afce-314af82e3f5b\") " pod="openstack/glance-db-sync-k72d4" Oct 03 15:00:02 crc kubenswrapper[4774]: I1003 15:00:02.626058 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wmhlw\" (UniqueName: \"kubernetes.io/projected/6c2fdc49-a155-4a5b-afce-314af82e3f5b-kube-api-access-wmhlw\") pod \"glance-db-sync-k72d4\" (UID: \"6c2fdc49-a155-4a5b-afce-314af82e3f5b\") " pod="openstack/glance-db-sync-k72d4" Oct 03 15:00:02 crc kubenswrapper[4774]: I1003 15:00:02.629579 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc0b2c39-8c1e-4401-97f6-a4306b435436","Type":"ContainerStarted","Data":"d9ac001e5fb2255cf3eb1d880a02e4915e56d85669d368e424cfd00a19a04e4b"} Oct 03 15:00:02 crc kubenswrapper[4774]: I1003 15:00:02.631838 4774 generic.go:334] "Generic (PLEG): container finished" podID="7fa97e79-a30c-4722-b02b-ec5494bd057c" containerID="ba4f05d232c8d413350b06f66392f7b2e2403d6d44ba76999902b2b487809df3" exitCode=0 Oct 03 15:00:02 crc kubenswrapper[4774]: I1003 15:00:02.631902 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7fa97e79-a30c-4722-b02b-ec5494bd057c","Type":"ContainerDied","Data":"ba4f05d232c8d413350b06f66392f7b2e2403d6d44ba76999902b2b487809df3"} Oct 03 15:00:02 crc kubenswrapper[4774]: I1003 15:00:02.633832 4774 generic.go:334] "Generic (PLEG): container finished" podID="0a0a516a-bd97-4484-802b-71eb14f3ca3f" containerID="bf3375d50c95a1b4c96eaea4067faf1aa3e5e031209fd4b5cd965560980c24ae" exitCode=0 Oct 03 15:00:02 crc kubenswrapper[4774]: I1003 15:00:02.634011 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0a0a516a-bd97-4484-802b-71eb14f3ca3f","Type":"ContainerDied","Data":"bf3375d50c95a1b4c96eaea4067faf1aa3e5e031209fd4b5cd965560980c24ae"} Oct 03 15:00:02 crc kubenswrapper[4774]: I1003 15:00:02.732939 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-k72d4" Oct 03 15:00:02 crc kubenswrapper[4774]: I1003 15:00:02.949895 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-lqwfp" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.110325 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba5eed25-71ac-44e3-bf15-daf7b9ac13c6-config-volume\") pod \"ba5eed25-71ac-44e3-bf15-daf7b9ac13c6\" (UID: \"ba5eed25-71ac-44e3-bf15-daf7b9ac13c6\") " Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.110675 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ba5eed25-71ac-44e3-bf15-daf7b9ac13c6-secret-volume\") pod \"ba5eed25-71ac-44e3-bf15-daf7b9ac13c6\" (UID: \"ba5eed25-71ac-44e3-bf15-daf7b9ac13c6\") " Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.110834 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v99s\" (UniqueName: \"kubernetes.io/projected/ba5eed25-71ac-44e3-bf15-daf7b9ac13c6-kube-api-access-8v99s\") pod \"ba5eed25-71ac-44e3-bf15-daf7b9ac13c6\" (UID: \"ba5eed25-71ac-44e3-bf15-daf7b9ac13c6\") " Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.112698 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba5eed25-71ac-44e3-bf15-daf7b9ac13c6-config-volume" (OuterVolumeSpecName: "config-volume") pod "ba5eed25-71ac-44e3-bf15-daf7b9ac13c6" (UID: "ba5eed25-71ac-44e3-bf15-daf7b9ac13c6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.117310 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba5eed25-71ac-44e3-bf15-daf7b9ac13c6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ba5eed25-71ac-44e3-bf15-daf7b9ac13c6" (UID: "ba5eed25-71ac-44e3-bf15-daf7b9ac13c6"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.133843 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba5eed25-71ac-44e3-bf15-daf7b9ac13c6-kube-api-access-8v99s" (OuterVolumeSpecName: "kube-api-access-8v99s") pod "ba5eed25-71ac-44e3-bf15-daf7b9ac13c6" (UID: "ba5eed25-71ac-44e3-bf15-daf7b9ac13c6"). InnerVolumeSpecName "kube-api-access-8v99s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.212750 4774 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba5eed25-71ac-44e3-bf15-daf7b9ac13c6-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.212781 4774 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ba5eed25-71ac-44e3-bf15-daf7b9ac13c6-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.212791 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8v99s\" (UniqueName: \"kubernetes.io/projected/ba5eed25-71ac-44e3-bf15-daf7b9ac13c6-kube-api-access-8v99s\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.301077 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-4wgb7" podUID="b9111154-59d2-4b07-b8c3-db1870883cde" containerName="ovn-controller" probeResult="failure" output=< Oct 03 15:00:03 crc kubenswrapper[4774]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 03 15:00:03 crc kubenswrapper[4774]: > Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.310532 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-bhmcl" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 
15:00:03.326575 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-bhmcl" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.350665 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-k72d4"] Oct 03 15:00:03 crc kubenswrapper[4774]: W1003 15:00:03.358617 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c2fdc49_a155_4a5b_afce_314af82e3f5b.slice/crio-544d3a3f81f69935528a83f365ce7c5a078a316509b6924f416e1dbd7109af4e WatchSource:0}: Error finding container 544d3a3f81f69935528a83f365ce7c5a078a316509b6924f416e1dbd7109af4e: Status 404 returned error can't find the container with id 544d3a3f81f69935528a83f365ce7c5a078a316509b6924f416e1dbd7109af4e Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.556978 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-4wgb7-config-4gt92"] Oct 03 15:00:03 crc kubenswrapper[4774]: E1003 15:00:03.557408 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba5eed25-71ac-44e3-bf15-daf7b9ac13c6" containerName="collect-profiles" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.557428 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba5eed25-71ac-44e3-bf15-daf7b9ac13c6" containerName="collect-profiles" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.557666 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba5eed25-71ac-44e3-bf15-daf7b9ac13c6" containerName="collect-profiles" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.558289 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-4wgb7-config-4gt92" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.562730 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.572416 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-4wgb7-config-4gt92"] Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.641729 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7fa97e79-a30c-4722-b02b-ec5494bd057c","Type":"ContainerStarted","Data":"b9c1f78fe703fc2ec1c54dca1c5970e4e1d1cd786f37bb24fddfa058335a6a97"} Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.641953 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.643315 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0a0a516a-bd97-4484-802b-71eb14f3ca3f","Type":"ContainerStarted","Data":"4f460fb22aff9e6a313dab120e473cd831b1efea3d93c5bf04659fe9ff70d964"} Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.643527 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.644694 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-lqwfp" event={"ID":"ba5eed25-71ac-44e3-bf15-daf7b9ac13c6","Type":"ContainerDied","Data":"380f4ce25e2e9cb124c9f0ccd9926b799596284b40b58351442cb9076272062a"} Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.644720 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="380f4ce25e2e9cb124c9f0ccd9926b799596284b40b58351442cb9076272062a" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.644726 4774 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-lqwfp" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.646386 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-k72d4" event={"ID":"6c2fdc49-a155-4a5b-afce-314af82e3f5b","Type":"ContainerStarted","Data":"544d3a3f81f69935528a83f365ce7c5a078a316509b6924f416e1dbd7109af4e"} Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.666518 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.010782233 podStartE2EDuration="1m11.666501529s" podCreationTimestamp="2025-10-03 14:58:52 +0000 UTC" firstStartedPulling="2025-10-03 14:58:54.628675115 +0000 UTC m=+957.217878567" lastFinishedPulling="2025-10-03 14:59:29.284394411 +0000 UTC m=+991.873597863" observedRunningTime="2025-10-03 15:00:03.660969581 +0000 UTC m=+1026.250173033" watchObservedRunningTime="2025-10-03 15:00:03.666501529 +0000 UTC m=+1026.255704981" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.683204 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.623428484 podStartE2EDuration="1m11.683190005s" podCreationTimestamp="2025-10-03 14:58:52 +0000 UTC" firstStartedPulling="2025-10-03 14:58:54.221624865 +0000 UTC m=+956.810828317" lastFinishedPulling="2025-10-03 14:59:29.281386386 +0000 UTC m=+991.870589838" observedRunningTime="2025-10-03 15:00:03.68057781 +0000 UTC m=+1026.269781262" watchObservedRunningTime="2025-10-03 15:00:03.683190005 +0000 UTC m=+1026.272393457" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.718880 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c2acd0b1-f812-458d-8255-2747c7160a20-var-log-ovn\") pod 
\"ovn-controller-4wgb7-config-4gt92\" (UID: \"c2acd0b1-f812-458d-8255-2747c7160a20\") " pod="openstack/ovn-controller-4wgb7-config-4gt92" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.719588 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2acd0b1-f812-458d-8255-2747c7160a20-scripts\") pod \"ovn-controller-4wgb7-config-4gt92\" (UID: \"c2acd0b1-f812-458d-8255-2747c7160a20\") " pod="openstack/ovn-controller-4wgb7-config-4gt92" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.719927 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6r5g\" (UniqueName: \"kubernetes.io/projected/c2acd0b1-f812-458d-8255-2747c7160a20-kube-api-access-v6r5g\") pod \"ovn-controller-4wgb7-config-4gt92\" (UID: \"c2acd0b1-f812-458d-8255-2747c7160a20\") " pod="openstack/ovn-controller-4wgb7-config-4gt92" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.720136 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c2acd0b1-f812-458d-8255-2747c7160a20-var-run-ovn\") pod \"ovn-controller-4wgb7-config-4gt92\" (UID: \"c2acd0b1-f812-458d-8255-2747c7160a20\") " pod="openstack/ovn-controller-4wgb7-config-4gt92" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.720572 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c2acd0b1-f812-458d-8255-2747c7160a20-var-run\") pod \"ovn-controller-4wgb7-config-4gt92\" (UID: \"c2acd0b1-f812-458d-8255-2747c7160a20\") " pod="openstack/ovn-controller-4wgb7-config-4gt92" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.721364 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c2acd0b1-f812-458d-8255-2747c7160a20-additional-scripts\") pod \"ovn-controller-4wgb7-config-4gt92\" (UID: \"c2acd0b1-f812-458d-8255-2747c7160a20\") " pod="openstack/ovn-controller-4wgb7-config-4gt92" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.822557 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c2acd0b1-f812-458d-8255-2747c7160a20-var-log-ovn\") pod \"ovn-controller-4wgb7-config-4gt92\" (UID: \"c2acd0b1-f812-458d-8255-2747c7160a20\") " pod="openstack/ovn-controller-4wgb7-config-4gt92" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.822609 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2acd0b1-f812-458d-8255-2747c7160a20-scripts\") pod \"ovn-controller-4wgb7-config-4gt92\" (UID: \"c2acd0b1-f812-458d-8255-2747c7160a20\") " pod="openstack/ovn-controller-4wgb7-config-4gt92" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.822639 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6r5g\" (UniqueName: \"kubernetes.io/projected/c2acd0b1-f812-458d-8255-2747c7160a20-kube-api-access-v6r5g\") pod \"ovn-controller-4wgb7-config-4gt92\" (UID: \"c2acd0b1-f812-458d-8255-2747c7160a20\") " pod="openstack/ovn-controller-4wgb7-config-4gt92" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.822679 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c2acd0b1-f812-458d-8255-2747c7160a20-var-run-ovn\") pod \"ovn-controller-4wgb7-config-4gt92\" (UID: \"c2acd0b1-f812-458d-8255-2747c7160a20\") " pod="openstack/ovn-controller-4wgb7-config-4gt92" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.822710 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/c2acd0b1-f812-458d-8255-2747c7160a20-var-run\") pod \"ovn-controller-4wgb7-config-4gt92\" (UID: \"c2acd0b1-f812-458d-8255-2747c7160a20\") " pod="openstack/ovn-controller-4wgb7-config-4gt92" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.822744 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c2acd0b1-f812-458d-8255-2747c7160a20-additional-scripts\") pod \"ovn-controller-4wgb7-config-4gt92\" (UID: \"c2acd0b1-f812-458d-8255-2747c7160a20\") " pod="openstack/ovn-controller-4wgb7-config-4gt92" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.823061 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c2acd0b1-f812-458d-8255-2747c7160a20-var-log-ovn\") pod \"ovn-controller-4wgb7-config-4gt92\" (UID: \"c2acd0b1-f812-458d-8255-2747c7160a20\") " pod="openstack/ovn-controller-4wgb7-config-4gt92" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.823136 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c2acd0b1-f812-458d-8255-2747c7160a20-var-run-ovn\") pod \"ovn-controller-4wgb7-config-4gt92\" (UID: \"c2acd0b1-f812-458d-8255-2747c7160a20\") " pod="openstack/ovn-controller-4wgb7-config-4gt92" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.823173 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c2acd0b1-f812-458d-8255-2747c7160a20-var-run\") pod \"ovn-controller-4wgb7-config-4gt92\" (UID: \"c2acd0b1-f812-458d-8255-2747c7160a20\") " pod="openstack/ovn-controller-4wgb7-config-4gt92" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.823537 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c2acd0b1-f812-458d-8255-2747c7160a20-additional-scripts\") pod \"ovn-controller-4wgb7-config-4gt92\" (UID: \"c2acd0b1-f812-458d-8255-2747c7160a20\") " pod="openstack/ovn-controller-4wgb7-config-4gt92" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.826152 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2acd0b1-f812-458d-8255-2747c7160a20-scripts\") pod \"ovn-controller-4wgb7-config-4gt92\" (UID: \"c2acd0b1-f812-458d-8255-2747c7160a20\") " pod="openstack/ovn-controller-4wgb7-config-4gt92" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.846298 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6r5g\" (UniqueName: \"kubernetes.io/projected/c2acd0b1-f812-458d-8255-2747c7160a20-kube-api-access-v6r5g\") pod \"ovn-controller-4wgb7-config-4gt92\" (UID: \"c2acd0b1-f812-458d-8255-2747c7160a20\") " pod="openstack/ovn-controller-4wgb7-config-4gt92" Oct 03 15:00:03 crc kubenswrapper[4774]: I1003 15:00:03.929278 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-4wgb7-config-4gt92" Oct 03 15:00:04 crc kubenswrapper[4774]: I1003 15:00:04.404636 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-4wgb7-config-4gt92"] Oct 03 15:00:04 crc kubenswrapper[4774]: W1003 15:00:04.615191 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2acd0b1_f812_458d_8255_2747c7160a20.slice/crio-a576a5b3e35e0a17e5e7e62f4c0273bd7cea41aa4210699419378fef61277c1e WatchSource:0}: Error finding container a576a5b3e35e0a17e5e7e62f4c0273bd7cea41aa4210699419378fef61277c1e: Status 404 returned error can't find the container with id a576a5b3e35e0a17e5e7e62f4c0273bd7cea41aa4210699419378fef61277c1e Oct 03 15:00:04 crc kubenswrapper[4774]: I1003 15:00:04.682279 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4wgb7-config-4gt92" event={"ID":"c2acd0b1-f812-458d-8255-2747c7160a20","Type":"ContainerStarted","Data":"a576a5b3e35e0a17e5e7e62f4c0273bd7cea41aa4210699419378fef61277c1e"} Oct 03 15:00:05 crc kubenswrapper[4774]: I1003 15:00:05.691484 4774 generic.go:334] "Generic (PLEG): container finished" podID="c2acd0b1-f812-458d-8255-2747c7160a20" containerID="cd3aaf95f01cd2b989fcb7c39289b44ea75ae137d4ce03872d6ecbab64985142" exitCode=0 Oct 03 15:00:05 crc kubenswrapper[4774]: I1003 15:00:05.691654 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4wgb7-config-4gt92" event={"ID":"c2acd0b1-f812-458d-8255-2747c7160a20","Type":"ContainerDied","Data":"cd3aaf95f01cd2b989fcb7c39289b44ea75ae137d4ce03872d6ecbab64985142"} Oct 03 15:00:05 crc kubenswrapper[4774]: I1003 15:00:05.696426 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc0b2c39-8c1e-4401-97f6-a4306b435436","Type":"ContainerStarted","Data":"a6d9a0a22a5475c99d9e07ffdde6af6cd104f85abda5ffd2cde432e1db383901"} Oct 03 15:00:05 crc 
kubenswrapper[4774]: I1003 15:00:05.696468 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc0b2c39-8c1e-4401-97f6-a4306b435436","Type":"ContainerStarted","Data":"81a7fce532a0b1a05d0dcade42b5400985437a53f9a65729f278534d0acd85e9"} Oct 03 15:00:05 crc kubenswrapper[4774]: I1003 15:00:05.696492 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc0b2c39-8c1e-4401-97f6-a4306b435436","Type":"ContainerStarted","Data":"5719cd1577cebe42e42c0913d85516d8d566559f209337fe67698d8c20ba1544"} Oct 03 15:00:05 crc kubenswrapper[4774]: I1003 15:00:05.696505 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc0b2c39-8c1e-4401-97f6-a4306b435436","Type":"ContainerStarted","Data":"d4efd0ce80471dac47907064c45cf6aae765de9e701ca230bca09262ad568ef8"} Oct 03 15:00:07 crc kubenswrapper[4774]: I1003 15:00:07.052321 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-418c-account-create-6xbkh"] Oct 03 15:00:07 crc kubenswrapper[4774]: I1003 15:00:07.053957 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-418c-account-create-6xbkh" Oct 03 15:00:07 crc kubenswrapper[4774]: I1003 15:00:07.056101 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 03 15:00:07 crc kubenswrapper[4774]: I1003 15:00:07.061209 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-418c-account-create-6xbkh"] Oct 03 15:00:07 crc kubenswrapper[4774]: I1003 15:00:07.160931 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-4wgb7-config-4gt92" Oct 03 15:00:07 crc kubenswrapper[4774]: I1003 15:00:07.186886 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99r8c\" (UniqueName: \"kubernetes.io/projected/eff4b471-cbe1-458d-b00a-9ab69a909afc-kube-api-access-99r8c\") pod \"placement-418c-account-create-6xbkh\" (UID: \"eff4b471-cbe1-458d-b00a-9ab69a909afc\") " pod="openstack/placement-418c-account-create-6xbkh" Oct 03 15:00:07 crc kubenswrapper[4774]: I1003 15:00:07.288616 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c2acd0b1-f812-458d-8255-2747c7160a20-var-run\") pod \"c2acd0b1-f812-458d-8255-2747c7160a20\" (UID: \"c2acd0b1-f812-458d-8255-2747c7160a20\") " Oct 03 15:00:07 crc kubenswrapper[4774]: I1003 15:00:07.288692 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6r5g\" (UniqueName: \"kubernetes.io/projected/c2acd0b1-f812-458d-8255-2747c7160a20-kube-api-access-v6r5g\") pod \"c2acd0b1-f812-458d-8255-2747c7160a20\" (UID: \"c2acd0b1-f812-458d-8255-2747c7160a20\") " Oct 03 15:00:07 crc kubenswrapper[4774]: I1003 15:00:07.288758 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c2acd0b1-f812-458d-8255-2747c7160a20-var-run-ovn\") pod \"c2acd0b1-f812-458d-8255-2747c7160a20\" (UID: \"c2acd0b1-f812-458d-8255-2747c7160a20\") " Oct 03 15:00:07 crc kubenswrapper[4774]: I1003 15:00:07.288792 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2acd0b1-f812-458d-8255-2747c7160a20-scripts\") pod \"c2acd0b1-f812-458d-8255-2747c7160a20\" (UID: \"c2acd0b1-f812-458d-8255-2747c7160a20\") " Oct 03 15:00:07 crc kubenswrapper[4774]: I1003 15:00:07.288798 4774 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2acd0b1-f812-458d-8255-2747c7160a20-var-run" (OuterVolumeSpecName: "var-run") pod "c2acd0b1-f812-458d-8255-2747c7160a20" (UID: "c2acd0b1-f812-458d-8255-2747c7160a20"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 15:00:07 crc kubenswrapper[4774]: I1003 15:00:07.288885 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c2acd0b1-f812-458d-8255-2747c7160a20-var-log-ovn\") pod \"c2acd0b1-f812-458d-8255-2747c7160a20\" (UID: \"c2acd0b1-f812-458d-8255-2747c7160a20\") " Oct 03 15:00:07 crc kubenswrapper[4774]: I1003 15:00:07.288905 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c2acd0b1-f812-458d-8255-2747c7160a20-additional-scripts\") pod \"c2acd0b1-f812-458d-8255-2747c7160a20\" (UID: \"c2acd0b1-f812-458d-8255-2747c7160a20\") " Oct 03 15:00:07 crc kubenswrapper[4774]: I1003 15:00:07.288842 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2acd0b1-f812-458d-8255-2747c7160a20-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c2acd0b1-f812-458d-8255-2747c7160a20" (UID: "c2acd0b1-f812-458d-8255-2747c7160a20"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 15:00:07 crc kubenswrapper[4774]: I1003 15:00:07.288901 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2acd0b1-f812-458d-8255-2747c7160a20-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c2acd0b1-f812-458d-8255-2747c7160a20" (UID: "c2acd0b1-f812-458d-8255-2747c7160a20"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 15:00:07 crc kubenswrapper[4774]: I1003 15:00:07.289781 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2acd0b1-f812-458d-8255-2747c7160a20-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "c2acd0b1-f812-458d-8255-2747c7160a20" (UID: "c2acd0b1-f812-458d-8255-2747c7160a20"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:00:07 crc kubenswrapper[4774]: I1003 15:00:07.290088 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99r8c\" (UniqueName: \"kubernetes.io/projected/eff4b471-cbe1-458d-b00a-9ab69a909afc-kube-api-access-99r8c\") pod \"placement-418c-account-create-6xbkh\" (UID: \"eff4b471-cbe1-458d-b00a-9ab69a909afc\") " pod="openstack/placement-418c-account-create-6xbkh" Oct 03 15:00:07 crc kubenswrapper[4774]: I1003 15:00:07.290341 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2acd0b1-f812-458d-8255-2747c7160a20-scripts" (OuterVolumeSpecName: "scripts") pod "c2acd0b1-f812-458d-8255-2747c7160a20" (UID: "c2acd0b1-f812-458d-8255-2747c7160a20"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:00:07 crc kubenswrapper[4774]: I1003 15:00:07.290749 4774 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c2acd0b1-f812-458d-8255-2747c7160a20-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:07 crc kubenswrapper[4774]: I1003 15:00:07.290770 4774 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c2acd0b1-f812-458d-8255-2747c7160a20-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:07 crc kubenswrapper[4774]: I1003 15:00:07.290783 4774 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c2acd0b1-f812-458d-8255-2747c7160a20-var-run\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:07 crc kubenswrapper[4774]: I1003 15:00:07.290797 4774 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c2acd0b1-f812-458d-8255-2747c7160a20-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:07 crc kubenswrapper[4774]: I1003 15:00:07.290807 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2acd0b1-f812-458d-8255-2747c7160a20-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:07 crc kubenswrapper[4774]: I1003 15:00:07.306827 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2acd0b1-f812-458d-8255-2747c7160a20-kube-api-access-v6r5g" (OuterVolumeSpecName: "kube-api-access-v6r5g") pod "c2acd0b1-f812-458d-8255-2747c7160a20" (UID: "c2acd0b1-f812-458d-8255-2747c7160a20"). InnerVolumeSpecName "kube-api-access-v6r5g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:00:07 crc kubenswrapper[4774]: I1003 15:00:07.310884 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99r8c\" (UniqueName: \"kubernetes.io/projected/eff4b471-cbe1-458d-b00a-9ab69a909afc-kube-api-access-99r8c\") pod \"placement-418c-account-create-6xbkh\" (UID: \"eff4b471-cbe1-458d-b00a-9ab69a909afc\") " pod="openstack/placement-418c-account-create-6xbkh" Oct 03 15:00:07 crc kubenswrapper[4774]: I1003 15:00:07.392095 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6r5g\" (UniqueName: \"kubernetes.io/projected/c2acd0b1-f812-458d-8255-2747c7160a20-kube-api-access-v6r5g\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:07 crc kubenswrapper[4774]: I1003 15:00:07.458662 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-418c-account-create-6xbkh" Oct 03 15:00:07 crc kubenswrapper[4774]: I1003 15:00:07.726172 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4wgb7-config-4gt92" event={"ID":"c2acd0b1-f812-458d-8255-2747c7160a20","Type":"ContainerDied","Data":"a576a5b3e35e0a17e5e7e62f4c0273bd7cea41aa4210699419378fef61277c1e"} Oct 03 15:00:07 crc kubenswrapper[4774]: I1003 15:00:07.726512 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a576a5b3e35e0a17e5e7e62f4c0273bd7cea41aa4210699419378fef61277c1e" Oct 03 15:00:07 crc kubenswrapper[4774]: I1003 15:00:07.726570 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-4wgb7-config-4gt92" Oct 03 15:00:07 crc kubenswrapper[4774]: I1003 15:00:07.738272 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-418c-account-create-6xbkh"] Oct 03 15:00:07 crc kubenswrapper[4774]: I1003 15:00:07.738768 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc0b2c39-8c1e-4401-97f6-a4306b435436","Type":"ContainerStarted","Data":"812659e8613fb170717ee811f161a962483729b33731577982e925a3e540f6b0"} Oct 03 15:00:07 crc kubenswrapper[4774]: I1003 15:00:07.738805 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc0b2c39-8c1e-4401-97f6-a4306b435436","Type":"ContainerStarted","Data":"180b91d719cba8895acb24cd11c5aef5c5d5a8c6da7a7ff03d4db44361443405"} Oct 03 15:00:07 crc kubenswrapper[4774]: I1003 15:00:07.738814 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc0b2c39-8c1e-4401-97f6-a4306b435436","Type":"ContainerStarted","Data":"fa6cfedbb5ac4115ef8849febbbc3b2420339ba2f70004ec58045b278982e2e2"} Oct 03 15:00:07 crc kubenswrapper[4774]: W1003 15:00:07.750156 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeff4b471_cbe1_458d_b00a_9ab69a909afc.slice/crio-20943697d99673a1b9dd9d9f1b2dbccce24e43c872629aa9ef4ea5c3b71ee0bb WatchSource:0}: Error finding container 20943697d99673a1b9dd9d9f1b2dbccce24e43c872629aa9ef4ea5c3b71ee0bb: Status 404 returned error can't find the container with id 20943697d99673a1b9dd9d9f1b2dbccce24e43c872629aa9ef4ea5c3b71ee0bb Oct 03 15:00:08 crc kubenswrapper[4774]: I1003 15:00:08.315663 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-4wgb7-config-4gt92"] Oct 03 15:00:08 crc kubenswrapper[4774]: I1003 15:00:08.323284 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ovn-controller-4wgb7-config-4gt92"] Oct 03 15:00:08 crc kubenswrapper[4774]: I1003 15:00:08.333666 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-4wgb7" Oct 03 15:00:08 crc kubenswrapper[4774]: I1003 15:00:08.792638 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc0b2c39-8c1e-4401-97f6-a4306b435436","Type":"ContainerStarted","Data":"0150e3ccf695e7a49820a35fd07f105f769f85a22d63f0dc11e9809b671b163d"} Oct 03 15:00:08 crc kubenswrapper[4774]: I1003 15:00:08.809115 4774 generic.go:334] "Generic (PLEG): container finished" podID="eff4b471-cbe1-458d-b00a-9ab69a909afc" containerID="e5a9291ddc286e46dade07b7c1330d3089b9bfff3d2dffbf26f97751487d00a0" exitCode=0 Oct 03 15:00:08 crc kubenswrapper[4774]: I1003 15:00:08.809169 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-418c-account-create-6xbkh" event={"ID":"eff4b471-cbe1-458d-b00a-9ab69a909afc","Type":"ContainerDied","Data":"e5a9291ddc286e46dade07b7c1330d3089b9bfff3d2dffbf26f97751487d00a0"} Oct 03 15:00:08 crc kubenswrapper[4774]: I1003 15:00:08.809201 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-418c-account-create-6xbkh" event={"ID":"eff4b471-cbe1-458d-b00a-9ab69a909afc","Type":"ContainerStarted","Data":"20943697d99673a1b9dd9d9f1b2dbccce24e43c872629aa9ef4ea5c3b71ee0bb"} Oct 03 15:00:09 crc kubenswrapper[4774]: I1003 15:00:09.313500 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2acd0b1-f812-458d-8255-2747c7160a20" path="/var/lib/kubelet/pods/c2acd0b1-f812-458d-8255-2747c7160a20/volumes" Oct 03 15:00:09 crc kubenswrapper[4774]: I1003 15:00:09.821193 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc0b2c39-8c1e-4401-97f6-a4306b435436","Type":"ContainerStarted","Data":"3ccdaf94a9f0b8283a67b7f64bbfade9c27d874b394c5bf433663b78a42f0261"} Oct 03 15:00:13 crc 
kubenswrapper[4774]: I1003 15:00:13.798891 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:00:13 crc kubenswrapper[4774]: I1003 15:00:13.970621 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 03 15:00:15 crc kubenswrapper[4774]: I1003 15:00:15.575628 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-zhgsz"] Oct 03 15:00:15 crc kubenswrapper[4774]: E1003 15:00:15.575995 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2acd0b1-f812-458d-8255-2747c7160a20" containerName="ovn-config" Oct 03 15:00:15 crc kubenswrapper[4774]: I1003 15:00:15.576010 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2acd0b1-f812-458d-8255-2747c7160a20" containerName="ovn-config" Oct 03 15:00:15 crc kubenswrapper[4774]: I1003 15:00:15.576195 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2acd0b1-f812-458d-8255-2747c7160a20" containerName="ovn-config" Oct 03 15:00:15 crc kubenswrapper[4774]: I1003 15:00:15.576755 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-zhgsz" Oct 03 15:00:15 crc kubenswrapper[4774]: I1003 15:00:15.592648 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-zhgsz"] Oct 03 15:00:15 crc kubenswrapper[4774]: I1003 15:00:15.660814 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-4t58x"] Oct 03 15:00:15 crc kubenswrapper[4774]: I1003 15:00:15.663798 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-4t58x" Oct 03 15:00:15 crc kubenswrapper[4774]: I1003 15:00:15.680715 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cplt2\" (UniqueName: \"kubernetes.io/projected/bb7d18d2-5f77-4f17-8fce-7ea5b663a23c-kube-api-access-cplt2\") pod \"cinder-db-create-zhgsz\" (UID: \"bb7d18d2-5f77-4f17-8fce-7ea5b663a23c\") " pod="openstack/cinder-db-create-zhgsz" Oct 03 15:00:15 crc kubenswrapper[4774]: I1003 15:00:15.682834 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-4t58x"] Oct 03 15:00:15 crc kubenswrapper[4774]: I1003 15:00:15.782670 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cplt2\" (UniqueName: \"kubernetes.io/projected/bb7d18d2-5f77-4f17-8fce-7ea5b663a23c-kube-api-access-cplt2\") pod \"cinder-db-create-zhgsz\" (UID: \"bb7d18d2-5f77-4f17-8fce-7ea5b663a23c\") " pod="openstack/cinder-db-create-zhgsz" Oct 03 15:00:15 crc kubenswrapper[4774]: I1003 15:00:15.782801 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfjqx\" (UniqueName: \"kubernetes.io/projected/b5dad4cd-6b95-4b92-804a-c94135bff5ae-kube-api-access-wfjqx\") pod \"barbican-db-create-4t58x\" (UID: \"b5dad4cd-6b95-4b92-804a-c94135bff5ae\") " pod="openstack/barbican-db-create-4t58x" Oct 03 15:00:15 crc kubenswrapper[4774]: I1003 15:00:15.805259 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cplt2\" (UniqueName: \"kubernetes.io/projected/bb7d18d2-5f77-4f17-8fce-7ea5b663a23c-kube-api-access-cplt2\") pod \"cinder-db-create-zhgsz\" (UID: \"bb7d18d2-5f77-4f17-8fce-7ea5b663a23c\") " pod="openstack/cinder-db-create-zhgsz" Oct 03 15:00:15 crc kubenswrapper[4774]: I1003 15:00:15.864185 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-7qrp9"] Oct 03 15:00:15 crc 
kubenswrapper[4774]: I1003 15:00:15.865199 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7qrp9" Oct 03 15:00:15 crc kubenswrapper[4774]: I1003 15:00:15.878382 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-7qrp9"] Oct 03 15:00:15 crc kubenswrapper[4774]: I1003 15:00:15.884344 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfjqx\" (UniqueName: \"kubernetes.io/projected/b5dad4cd-6b95-4b92-804a-c94135bff5ae-kube-api-access-wfjqx\") pod \"barbican-db-create-4t58x\" (UID: \"b5dad4cd-6b95-4b92-804a-c94135bff5ae\") " pod="openstack/barbican-db-create-4t58x" Oct 03 15:00:15 crc kubenswrapper[4774]: I1003 15:00:15.902788 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-zhgsz" Oct 03 15:00:15 crc kubenswrapper[4774]: I1003 15:00:15.930527 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfjqx\" (UniqueName: \"kubernetes.io/projected/b5dad4cd-6b95-4b92-804a-c94135bff5ae-kube-api-access-wfjqx\") pod \"barbican-db-create-4t58x\" (UID: \"b5dad4cd-6b95-4b92-804a-c94135bff5ae\") " pod="openstack/barbican-db-create-4t58x" Oct 03 15:00:15 crc kubenswrapper[4774]: I1003 15:00:15.941770 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-x6k8t"] Oct 03 15:00:15 crc kubenswrapper[4774]: I1003 15:00:15.943061 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-x6k8t" Oct 03 15:00:15 crc kubenswrapper[4774]: I1003 15:00:15.946358 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 15:00:15 crc kubenswrapper[4774]: I1003 15:00:15.946575 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 15:00:15 crc kubenswrapper[4774]: I1003 15:00:15.946829 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-5rssc" Oct 03 15:00:15 crc kubenswrapper[4774]: I1003 15:00:15.946926 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 15:00:15 crc kubenswrapper[4774]: I1003 15:00:15.964623 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-x6k8t"] Oct 03 15:00:15 crc kubenswrapper[4774]: I1003 15:00:15.979546 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-4t58x" Oct 03 15:00:15 crc kubenswrapper[4774]: I1003 15:00:15.986336 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8wp2\" (UniqueName: \"kubernetes.io/projected/13144fd0-9c06-480f-8809-1021c8f2ccd3-kube-api-access-d8wp2\") pod \"neutron-db-create-7qrp9\" (UID: \"13144fd0-9c06-480f-8809-1021c8f2ccd3\") " pod="openstack/neutron-db-create-7qrp9" Oct 03 15:00:16 crc kubenswrapper[4774]: I1003 15:00:16.087844 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv5p6\" (UniqueName: \"kubernetes.io/projected/10568dd1-d320-4fed-b12b-7ded3500a3e9-kube-api-access-hv5p6\") pod \"keystone-db-sync-x6k8t\" (UID: \"10568dd1-d320-4fed-b12b-7ded3500a3e9\") " pod="openstack/keystone-db-sync-x6k8t" Oct 03 15:00:16 crc kubenswrapper[4774]: I1003 15:00:16.087963 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10568dd1-d320-4fed-b12b-7ded3500a3e9-combined-ca-bundle\") pod \"keystone-db-sync-x6k8t\" (UID: \"10568dd1-d320-4fed-b12b-7ded3500a3e9\") " pod="openstack/keystone-db-sync-x6k8t" Oct 03 15:00:16 crc kubenswrapper[4774]: I1003 15:00:16.088220 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8wp2\" (UniqueName: \"kubernetes.io/projected/13144fd0-9c06-480f-8809-1021c8f2ccd3-kube-api-access-d8wp2\") pod \"neutron-db-create-7qrp9\" (UID: \"13144fd0-9c06-480f-8809-1021c8f2ccd3\") " pod="openstack/neutron-db-create-7qrp9" Oct 03 15:00:16 crc kubenswrapper[4774]: I1003 15:00:16.088264 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10568dd1-d320-4fed-b12b-7ded3500a3e9-config-data\") pod \"keystone-db-sync-x6k8t\" (UID: \"10568dd1-d320-4fed-b12b-7ded3500a3e9\") " pod="openstack/keystone-db-sync-x6k8t" Oct 03 15:00:16 crc kubenswrapper[4774]: I1003 15:00:16.124815 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8wp2\" (UniqueName: \"kubernetes.io/projected/13144fd0-9c06-480f-8809-1021c8f2ccd3-kube-api-access-d8wp2\") pod \"neutron-db-create-7qrp9\" (UID: \"13144fd0-9c06-480f-8809-1021c8f2ccd3\") " pod="openstack/neutron-db-create-7qrp9" Oct 03 15:00:16 crc kubenswrapper[4774]: I1003 15:00:16.179884 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-7qrp9" Oct 03 15:00:16 crc kubenswrapper[4774]: I1003 15:00:16.189991 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv5p6\" (UniqueName: \"kubernetes.io/projected/10568dd1-d320-4fed-b12b-7ded3500a3e9-kube-api-access-hv5p6\") pod \"keystone-db-sync-x6k8t\" (UID: \"10568dd1-d320-4fed-b12b-7ded3500a3e9\") " pod="openstack/keystone-db-sync-x6k8t" Oct 03 15:00:16 crc kubenswrapper[4774]: I1003 15:00:16.190069 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10568dd1-d320-4fed-b12b-7ded3500a3e9-combined-ca-bundle\") pod \"keystone-db-sync-x6k8t\" (UID: \"10568dd1-d320-4fed-b12b-7ded3500a3e9\") " pod="openstack/keystone-db-sync-x6k8t" Oct 03 15:00:16 crc kubenswrapper[4774]: I1003 15:00:16.190145 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10568dd1-d320-4fed-b12b-7ded3500a3e9-config-data\") pod \"keystone-db-sync-x6k8t\" (UID: \"10568dd1-d320-4fed-b12b-7ded3500a3e9\") " pod="openstack/keystone-db-sync-x6k8t" Oct 03 15:00:16 crc kubenswrapper[4774]: I1003 15:00:16.194316 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10568dd1-d320-4fed-b12b-7ded3500a3e9-combined-ca-bundle\") pod \"keystone-db-sync-x6k8t\" (UID: \"10568dd1-d320-4fed-b12b-7ded3500a3e9\") " pod="openstack/keystone-db-sync-x6k8t" Oct 03 15:00:16 crc kubenswrapper[4774]: I1003 15:00:16.208794 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10568dd1-d320-4fed-b12b-7ded3500a3e9-config-data\") pod \"keystone-db-sync-x6k8t\" (UID: \"10568dd1-d320-4fed-b12b-7ded3500a3e9\") " pod="openstack/keystone-db-sync-x6k8t" Oct 03 15:00:16 crc kubenswrapper[4774]: I1003 
15:00:16.209464 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv5p6\" (UniqueName: \"kubernetes.io/projected/10568dd1-d320-4fed-b12b-7ded3500a3e9-kube-api-access-hv5p6\") pod \"keystone-db-sync-x6k8t\" (UID: \"10568dd1-d320-4fed-b12b-7ded3500a3e9\") " pod="openstack/keystone-db-sync-x6k8t" Oct 03 15:00:16 crc kubenswrapper[4774]: I1003 15:00:16.272816 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-x6k8t" Oct 03 15:00:18 crc kubenswrapper[4774]: I1003 15:00:18.175320 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-418c-account-create-6xbkh" Oct 03 15:00:18 crc kubenswrapper[4774]: I1003 15:00:18.337975 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99r8c\" (UniqueName: \"kubernetes.io/projected/eff4b471-cbe1-458d-b00a-9ab69a909afc-kube-api-access-99r8c\") pod \"eff4b471-cbe1-458d-b00a-9ab69a909afc\" (UID: \"eff4b471-cbe1-458d-b00a-9ab69a909afc\") " Oct 03 15:00:18 crc kubenswrapper[4774]: I1003 15:00:18.344034 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eff4b471-cbe1-458d-b00a-9ab69a909afc-kube-api-access-99r8c" (OuterVolumeSpecName: "kube-api-access-99r8c") pod "eff4b471-cbe1-458d-b00a-9ab69a909afc" (UID: "eff4b471-cbe1-458d-b00a-9ab69a909afc"). InnerVolumeSpecName "kube-api-access-99r8c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:00:18 crc kubenswrapper[4774]: I1003 15:00:18.440740 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99r8c\" (UniqueName: \"kubernetes.io/projected/eff4b471-cbe1-458d-b00a-9ab69a909afc-kube-api-access-99r8c\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:18 crc kubenswrapper[4774]: I1003 15:00:18.662604 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-zhgsz"] Oct 03 15:00:18 crc kubenswrapper[4774]: I1003 15:00:18.671232 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-x6k8t"] Oct 03 15:00:18 crc kubenswrapper[4774]: I1003 15:00:18.776848 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-7qrp9"] Oct 03 15:00:18 crc kubenswrapper[4774]: I1003 15:00:18.784656 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-4t58x"] Oct 03 15:00:18 crc kubenswrapper[4774]: W1003 15:00:18.810817 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13144fd0_9c06_480f_8809_1021c8f2ccd3.slice/crio-1d5924334d2e22d4d8b9a777db0ea5a1cf79eee1367c5a55ee30d4949c6621b5 WatchSource:0}: Error finding container 1d5924334d2e22d4d8b9a777db0ea5a1cf79eee1367c5a55ee30d4949c6621b5: Status 404 returned error can't find the container with id 1d5924334d2e22d4d8b9a777db0ea5a1cf79eee1367c5a55ee30d4949c6621b5 Oct 03 15:00:18 crc kubenswrapper[4774]: W1003 15:00:18.812343 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5dad4cd_6b95_4b92_804a_c94135bff5ae.slice/crio-e558a5de0d7d11a79368e8da295696139cf395faf6aa090a9d0f7352fdf249d3 WatchSource:0}: Error finding container e558a5de0d7d11a79368e8da295696139cf395faf6aa090a9d0f7352fdf249d3: Status 404 returned error can't find the container with id 
e558a5de0d7d11a79368e8da295696139cf395faf6aa090a9d0f7352fdf249d3 Oct 03 15:00:18 crc kubenswrapper[4774]: I1003 15:00:18.922441 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc0b2c39-8c1e-4401-97f6-a4306b435436","Type":"ContainerStarted","Data":"0c20f50a1b10cdd3d4f2caaa6be7fe224559be31dac9d3dbd73d4d340c0144f7"} Oct 03 15:00:18 crc kubenswrapper[4774]: I1003 15:00:18.922685 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc0b2c39-8c1e-4401-97f6-a4306b435436","Type":"ContainerStarted","Data":"c78af5f52fc011d2311f8319cff1ab09860da3a1050817f9a9da5883a1ffac88"} Oct 03 15:00:18 crc kubenswrapper[4774]: I1003 15:00:18.922695 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc0b2c39-8c1e-4401-97f6-a4306b435436","Type":"ContainerStarted","Data":"efc79ab9a1d8e2999f9559427fa8e6e4d5d2e8becd4a5efe46cf45eeb450140c"} Oct 03 15:00:18 crc kubenswrapper[4774]: I1003 15:00:18.923842 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-k72d4" event={"ID":"6c2fdc49-a155-4a5b-afce-314af82e3f5b","Type":"ContainerStarted","Data":"8c116304c75bf62309ede65631dad7f489deaa5d260dfc5583261549653bb8c9"} Oct 03 15:00:18 crc kubenswrapper[4774]: I1003 15:00:18.927045 4774 generic.go:334] "Generic (PLEG): container finished" podID="bb7d18d2-5f77-4f17-8fce-7ea5b663a23c" containerID="c0f76ae4203ec72e832ffd26f69a5a6ae33ff6fd5a597716122dfcae726f839d" exitCode=0 Oct 03 15:00:18 crc kubenswrapper[4774]: I1003 15:00:18.927112 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-zhgsz" event={"ID":"bb7d18d2-5f77-4f17-8fce-7ea5b663a23c","Type":"ContainerDied","Data":"c0f76ae4203ec72e832ffd26f69a5a6ae33ff6fd5a597716122dfcae726f839d"} Oct 03 15:00:18 crc kubenswrapper[4774]: I1003 15:00:18.927137 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-zhgsz" 
event={"ID":"bb7d18d2-5f77-4f17-8fce-7ea5b663a23c","Type":"ContainerStarted","Data":"ccff1072948741accf27fd4fdcc5a59ecdfd0186a0ae112832708f690234b96b"} Oct 03 15:00:18 crc kubenswrapper[4774]: I1003 15:00:18.928367 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-418c-account-create-6xbkh" event={"ID":"eff4b471-cbe1-458d-b00a-9ab69a909afc","Type":"ContainerDied","Data":"20943697d99673a1b9dd9d9f1b2dbccce24e43c872629aa9ef4ea5c3b71ee0bb"} Oct 03 15:00:18 crc kubenswrapper[4774]: I1003 15:00:18.928425 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-418c-account-create-6xbkh" Oct 03 15:00:18 crc kubenswrapper[4774]: I1003 15:00:18.928430 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20943697d99673a1b9dd9d9f1b2dbccce24e43c872629aa9ef4ea5c3b71ee0bb" Oct 03 15:00:18 crc kubenswrapper[4774]: I1003 15:00:18.929294 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-x6k8t" event={"ID":"10568dd1-d320-4fed-b12b-7ded3500a3e9","Type":"ContainerStarted","Data":"7d3766c3da9f39ffc2a9aa7672b17876abd4d82f27aaf50e53febaecb4df190b"} Oct 03 15:00:18 crc kubenswrapper[4774]: I1003 15:00:18.931178 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7qrp9" event={"ID":"13144fd0-9c06-480f-8809-1021c8f2ccd3","Type":"ContainerStarted","Data":"1d5924334d2e22d4d8b9a777db0ea5a1cf79eee1367c5a55ee30d4949c6621b5"} Oct 03 15:00:18 crc kubenswrapper[4774]: I1003 15:00:18.932293 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4t58x" event={"ID":"b5dad4cd-6b95-4b92-804a-c94135bff5ae","Type":"ContainerStarted","Data":"e558a5de0d7d11a79368e8da295696139cf395faf6aa090a9d0f7352fdf249d3"} Oct 03 15:00:18 crc kubenswrapper[4774]: I1003 15:00:18.945114 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-k72d4" 
podStartSLOduration=2.147342269 podStartE2EDuration="16.945096291s" podCreationTimestamp="2025-10-03 15:00:02 +0000 UTC" firstStartedPulling="2025-10-03 15:00:03.362699991 +0000 UTC m=+1025.951903443" lastFinishedPulling="2025-10-03 15:00:18.160453993 +0000 UTC m=+1040.749657465" observedRunningTime="2025-10-03 15:00:18.939897061 +0000 UTC m=+1041.529100533" watchObservedRunningTime="2025-10-03 15:00:18.945096291 +0000 UTC m=+1041.534299753" Oct 03 15:00:19 crc kubenswrapper[4774]: I1003 15:00:19.940132 4774 generic.go:334] "Generic (PLEG): container finished" podID="b5dad4cd-6b95-4b92-804a-c94135bff5ae" containerID="a94a76982647cf91a85aa74e07b8e59fb2f08a059d57bfef113647c736fe4e63" exitCode=0 Oct 03 15:00:19 crc kubenswrapper[4774]: I1003 15:00:19.940220 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4t58x" event={"ID":"b5dad4cd-6b95-4b92-804a-c94135bff5ae","Type":"ContainerDied","Data":"a94a76982647cf91a85aa74e07b8e59fb2f08a059d57bfef113647c736fe4e63"} Oct 03 15:00:19 crc kubenswrapper[4774]: I1003 15:00:19.959024 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc0b2c39-8c1e-4401-97f6-a4306b435436","Type":"ContainerStarted","Data":"b515534f7827df9592ca1f0f5b4cf245624f6fdfe4302c1489c8ff4a2444ce81"} Oct 03 15:00:19 crc kubenswrapper[4774]: I1003 15:00:19.959080 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc0b2c39-8c1e-4401-97f6-a4306b435436","Type":"ContainerStarted","Data":"604bcda47eeecd132e6ab987d50da4aa3679ff632776cb85af5b4a20c14a7d17"} Oct 03 15:00:19 crc kubenswrapper[4774]: I1003 15:00:19.959099 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc0b2c39-8c1e-4401-97f6-a4306b435436","Type":"ContainerStarted","Data":"309b0f7c346301b04200f18f869a23c4bdcce66eda3973306ed7a022f3292a29"} Oct 03 15:00:19 crc kubenswrapper[4774]: I1003 15:00:19.962207 4774 generic.go:334] 
"Generic (PLEG): container finished" podID="13144fd0-9c06-480f-8809-1021c8f2ccd3" containerID="c900d8688d36e8c1c04f32c90d1c12885d723ac8e764adebcd57c448a592060c" exitCode=0 Oct 03 15:00:19 crc kubenswrapper[4774]: I1003 15:00:19.962278 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7qrp9" event={"ID":"13144fd0-9c06-480f-8809-1021c8f2ccd3","Type":"ContainerDied","Data":"c900d8688d36e8c1c04f32c90d1c12885d723ac8e764adebcd57c448a592060c"} Oct 03 15:00:20 crc kubenswrapper[4774]: I1003 15:00:20.003788 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=45.264443054 podStartE2EDuration="52.003768765s" podCreationTimestamp="2025-10-03 14:59:28 +0000 UTC" firstStartedPulling="2025-10-03 15:00:02.562306601 +0000 UTC m=+1025.151510053" lastFinishedPulling="2025-10-03 15:00:09.301632312 +0000 UTC m=+1031.890835764" observedRunningTime="2025-10-03 15:00:19.986907435 +0000 UTC m=+1042.576110897" watchObservedRunningTime="2025-10-03 15:00:20.003768765 +0000 UTC m=+1042.592972217" Oct 03 15:00:20 crc kubenswrapper[4774]: I1003 15:00:20.262190 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-qlzcz"] Oct 03 15:00:20 crc kubenswrapper[4774]: E1003 15:00:20.262839 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eff4b471-cbe1-458d-b00a-9ab69a909afc" containerName="mariadb-account-create" Oct 03 15:00:20 crc kubenswrapper[4774]: I1003 15:00:20.262969 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="eff4b471-cbe1-458d-b00a-9ab69a909afc" containerName="mariadb-account-create" Oct 03 15:00:20 crc kubenswrapper[4774]: I1003 15:00:20.263266 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="eff4b471-cbe1-458d-b00a-9ab69a909afc" containerName="mariadb-account-create" Oct 03 15:00:20 crc kubenswrapper[4774]: I1003 15:00:20.264447 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-qlzcz" Oct 03 15:00:20 crc kubenswrapper[4774]: I1003 15:00:20.266276 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 03 15:00:20 crc kubenswrapper[4774]: I1003 15:00:20.289122 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-qlzcz"] Oct 03 15:00:20 crc kubenswrapper[4774]: I1003 15:00:20.305492 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p7g9\" (UniqueName: \"kubernetes.io/projected/4790dde0-2293-42c3-b068-19ac0c89968f-kube-api-access-4p7g9\") pod \"dnsmasq-dns-77585f5f8c-qlzcz\" (UID: \"4790dde0-2293-42c3-b068-19ac0c89968f\") " pod="openstack/dnsmasq-dns-77585f5f8c-qlzcz" Oct 03 15:00:20 crc kubenswrapper[4774]: I1003 15:00:20.305761 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4790dde0-2293-42c3-b068-19ac0c89968f-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-qlzcz\" (UID: \"4790dde0-2293-42c3-b068-19ac0c89968f\") " pod="openstack/dnsmasq-dns-77585f5f8c-qlzcz" Oct 03 15:00:20 crc kubenswrapper[4774]: I1003 15:00:20.305830 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4790dde0-2293-42c3-b068-19ac0c89968f-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-qlzcz\" (UID: \"4790dde0-2293-42c3-b068-19ac0c89968f\") " pod="openstack/dnsmasq-dns-77585f5f8c-qlzcz" Oct 03 15:00:20 crc kubenswrapper[4774]: I1003 15:00:20.306017 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4790dde0-2293-42c3-b068-19ac0c89968f-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-qlzcz\" (UID: \"4790dde0-2293-42c3-b068-19ac0c89968f\") " 
pod="openstack/dnsmasq-dns-77585f5f8c-qlzcz" Oct 03 15:00:20 crc kubenswrapper[4774]: I1003 15:00:20.306078 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4790dde0-2293-42c3-b068-19ac0c89968f-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-qlzcz\" (UID: \"4790dde0-2293-42c3-b068-19ac0c89968f\") " pod="openstack/dnsmasq-dns-77585f5f8c-qlzcz" Oct 03 15:00:20 crc kubenswrapper[4774]: I1003 15:00:20.306133 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4790dde0-2293-42c3-b068-19ac0c89968f-config\") pod \"dnsmasq-dns-77585f5f8c-qlzcz\" (UID: \"4790dde0-2293-42c3-b068-19ac0c89968f\") " pod="openstack/dnsmasq-dns-77585f5f8c-qlzcz" Oct 03 15:00:20 crc kubenswrapper[4774]: I1003 15:00:20.326306 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-zhgsz" Oct 03 15:00:20 crc kubenswrapper[4774]: I1003 15:00:20.407869 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cplt2\" (UniqueName: \"kubernetes.io/projected/bb7d18d2-5f77-4f17-8fce-7ea5b663a23c-kube-api-access-cplt2\") pod \"bb7d18d2-5f77-4f17-8fce-7ea5b663a23c\" (UID: \"bb7d18d2-5f77-4f17-8fce-7ea5b663a23c\") " Oct 03 15:00:20 crc kubenswrapper[4774]: I1003 15:00:20.408159 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4790dde0-2293-42c3-b068-19ac0c89968f-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-qlzcz\" (UID: \"4790dde0-2293-42c3-b068-19ac0c89968f\") " pod="openstack/dnsmasq-dns-77585f5f8c-qlzcz" Oct 03 15:00:20 crc kubenswrapper[4774]: I1003 15:00:20.408214 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4790dde0-2293-42c3-b068-19ac0c89968f-config\") pod \"dnsmasq-dns-77585f5f8c-qlzcz\" (UID: \"4790dde0-2293-42c3-b068-19ac0c89968f\") " pod="openstack/dnsmasq-dns-77585f5f8c-qlzcz" Oct 03 15:00:20 crc kubenswrapper[4774]: I1003 15:00:20.408276 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p7g9\" (UniqueName: \"kubernetes.io/projected/4790dde0-2293-42c3-b068-19ac0c89968f-kube-api-access-4p7g9\") pod \"dnsmasq-dns-77585f5f8c-qlzcz\" (UID: \"4790dde0-2293-42c3-b068-19ac0c89968f\") " pod="openstack/dnsmasq-dns-77585f5f8c-qlzcz" Oct 03 15:00:20 crc kubenswrapper[4774]: I1003 15:00:20.408310 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4790dde0-2293-42c3-b068-19ac0c89968f-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-qlzcz\" (UID: \"4790dde0-2293-42c3-b068-19ac0c89968f\") " pod="openstack/dnsmasq-dns-77585f5f8c-qlzcz" Oct 03 15:00:20 crc kubenswrapper[4774]: I1003 15:00:20.408344 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4790dde0-2293-42c3-b068-19ac0c89968f-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-qlzcz\" (UID: \"4790dde0-2293-42c3-b068-19ac0c89968f\") " pod="openstack/dnsmasq-dns-77585f5f8c-qlzcz" Oct 03 15:00:20 crc kubenswrapper[4774]: I1003 15:00:20.408446 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4790dde0-2293-42c3-b068-19ac0c89968f-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-qlzcz\" (UID: \"4790dde0-2293-42c3-b068-19ac0c89968f\") " pod="openstack/dnsmasq-dns-77585f5f8c-qlzcz" Oct 03 15:00:20 crc kubenswrapper[4774]: I1003 15:00:20.409578 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/4790dde0-2293-42c3-b068-19ac0c89968f-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-qlzcz\" (UID: \"4790dde0-2293-42c3-b068-19ac0c89968f\") " pod="openstack/dnsmasq-dns-77585f5f8c-qlzcz" Oct 03 15:00:20 crc kubenswrapper[4774]: I1003 15:00:20.409608 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4790dde0-2293-42c3-b068-19ac0c89968f-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-qlzcz\" (UID: \"4790dde0-2293-42c3-b068-19ac0c89968f\") " pod="openstack/dnsmasq-dns-77585f5f8c-qlzcz" Oct 03 15:00:20 crc kubenswrapper[4774]: I1003 15:00:20.410053 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4790dde0-2293-42c3-b068-19ac0c89968f-config\") pod \"dnsmasq-dns-77585f5f8c-qlzcz\" (UID: \"4790dde0-2293-42c3-b068-19ac0c89968f\") " pod="openstack/dnsmasq-dns-77585f5f8c-qlzcz" Oct 03 15:00:20 crc kubenswrapper[4774]: I1003 15:00:20.410152 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4790dde0-2293-42c3-b068-19ac0c89968f-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-qlzcz\" (UID: \"4790dde0-2293-42c3-b068-19ac0c89968f\") " pod="openstack/dnsmasq-dns-77585f5f8c-qlzcz" Oct 03 15:00:20 crc kubenswrapper[4774]: I1003 15:00:20.410580 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4790dde0-2293-42c3-b068-19ac0c89968f-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-qlzcz\" (UID: \"4790dde0-2293-42c3-b068-19ac0c89968f\") " pod="openstack/dnsmasq-dns-77585f5f8c-qlzcz" Oct 03 15:00:20 crc kubenswrapper[4774]: I1003 15:00:20.414617 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb7d18d2-5f77-4f17-8fce-7ea5b663a23c-kube-api-access-cplt2" (OuterVolumeSpecName: "kube-api-access-cplt2") pod 
"bb7d18d2-5f77-4f17-8fce-7ea5b663a23c" (UID: "bb7d18d2-5f77-4f17-8fce-7ea5b663a23c"). InnerVolumeSpecName "kube-api-access-cplt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:00:20 crc kubenswrapper[4774]: I1003 15:00:20.431653 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p7g9\" (UniqueName: \"kubernetes.io/projected/4790dde0-2293-42c3-b068-19ac0c89968f-kube-api-access-4p7g9\") pod \"dnsmasq-dns-77585f5f8c-qlzcz\" (UID: \"4790dde0-2293-42c3-b068-19ac0c89968f\") " pod="openstack/dnsmasq-dns-77585f5f8c-qlzcz" Oct 03 15:00:20 crc kubenswrapper[4774]: I1003 15:00:20.509361 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cplt2\" (UniqueName: \"kubernetes.io/projected/bb7d18d2-5f77-4f17-8fce-7ea5b663a23c-kube-api-access-cplt2\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:20 crc kubenswrapper[4774]: I1003 15:00:20.636594 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-qlzcz" Oct 03 15:00:20 crc kubenswrapper[4774]: I1003 15:00:20.971526 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-zhgsz" Oct 03 15:00:20 crc kubenswrapper[4774]: I1003 15:00:20.972964 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-zhgsz" event={"ID":"bb7d18d2-5f77-4f17-8fce-7ea5b663a23c","Type":"ContainerDied","Data":"ccff1072948741accf27fd4fdcc5a59ecdfd0186a0ae112832708f690234b96b"} Oct 03 15:00:20 crc kubenswrapper[4774]: I1003 15:00:20.973011 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ccff1072948741accf27fd4fdcc5a59ecdfd0186a0ae112832708f690234b96b" Oct 03 15:00:21 crc kubenswrapper[4774]: I1003 15:00:21.097297 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-qlzcz"] Oct 03 15:00:23 crc kubenswrapper[4774]: W1003 15:00:23.663547 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4790dde0_2293_42c3_b068_19ac0c89968f.slice/crio-375930ff9392b6965f4f7851b7c3bd8c1fbdf67d67bb00b397ba9e60f3e9b7ee WatchSource:0}: Error finding container 375930ff9392b6965f4f7851b7c3bd8c1fbdf67d67bb00b397ba9e60f3e9b7ee: Status 404 returned error can't find the container with id 375930ff9392b6965f4f7851b7c3bd8c1fbdf67d67bb00b397ba9e60f3e9b7ee Oct 03 15:00:23 crc kubenswrapper[4774]: I1003 15:00:23.919937 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7qrp9" Oct 03 15:00:23 crc kubenswrapper[4774]: I1003 15:00:23.949297 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-4t58x" Oct 03 15:00:23 crc kubenswrapper[4774]: I1003 15:00:23.966693 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfjqx\" (UniqueName: \"kubernetes.io/projected/b5dad4cd-6b95-4b92-804a-c94135bff5ae-kube-api-access-wfjqx\") pod \"b5dad4cd-6b95-4b92-804a-c94135bff5ae\" (UID: \"b5dad4cd-6b95-4b92-804a-c94135bff5ae\") " Oct 03 15:00:23 crc kubenswrapper[4774]: I1003 15:00:23.966944 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8wp2\" (UniqueName: \"kubernetes.io/projected/13144fd0-9c06-480f-8809-1021c8f2ccd3-kube-api-access-d8wp2\") pod \"13144fd0-9c06-480f-8809-1021c8f2ccd3\" (UID: \"13144fd0-9c06-480f-8809-1021c8f2ccd3\") " Oct 03 15:00:23 crc kubenswrapper[4774]: I1003 15:00:23.983239 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5dad4cd-6b95-4b92-804a-c94135bff5ae-kube-api-access-wfjqx" (OuterVolumeSpecName: "kube-api-access-wfjqx") pod "b5dad4cd-6b95-4b92-804a-c94135bff5ae" (UID: "b5dad4cd-6b95-4b92-804a-c94135bff5ae"). InnerVolumeSpecName "kube-api-access-wfjqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:00:23 crc kubenswrapper[4774]: I1003 15:00:23.987101 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13144fd0-9c06-480f-8809-1021c8f2ccd3-kube-api-access-d8wp2" (OuterVolumeSpecName: "kube-api-access-d8wp2") pod "13144fd0-9c06-480f-8809-1021c8f2ccd3" (UID: "13144fd0-9c06-480f-8809-1021c8f2ccd3"). InnerVolumeSpecName "kube-api-access-d8wp2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:00:23 crc kubenswrapper[4774]: I1003 15:00:23.998403 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-qlzcz" event={"ID":"4790dde0-2293-42c3-b068-19ac0c89968f","Type":"ContainerStarted","Data":"375930ff9392b6965f4f7851b7c3bd8c1fbdf67d67bb00b397ba9e60f3e9b7ee"} Oct 03 15:00:24 crc kubenswrapper[4774]: I1003 15:00:24.000464 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4t58x" event={"ID":"b5dad4cd-6b95-4b92-804a-c94135bff5ae","Type":"ContainerDied","Data":"e558a5de0d7d11a79368e8da295696139cf395faf6aa090a9d0f7352fdf249d3"} Oct 03 15:00:24 crc kubenswrapper[4774]: I1003 15:00:24.000535 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e558a5de0d7d11a79368e8da295696139cf395faf6aa090a9d0f7352fdf249d3" Oct 03 15:00:24 crc kubenswrapper[4774]: I1003 15:00:24.000635 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-4t58x" Oct 03 15:00:24 crc kubenswrapper[4774]: I1003 15:00:24.006459 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7qrp9" event={"ID":"13144fd0-9c06-480f-8809-1021c8f2ccd3","Type":"ContainerDied","Data":"1d5924334d2e22d4d8b9a777db0ea5a1cf79eee1367c5a55ee30d4949c6621b5"} Oct 03 15:00:24 crc kubenswrapper[4774]: I1003 15:00:24.006496 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d5924334d2e22d4d8b9a777db0ea5a1cf79eee1367c5a55ee30d4949c6621b5" Oct 03 15:00:24 crc kubenswrapper[4774]: I1003 15:00:24.006553 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-7qrp9" Oct 03 15:00:24 crc kubenswrapper[4774]: I1003 15:00:24.075748 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfjqx\" (UniqueName: \"kubernetes.io/projected/b5dad4cd-6b95-4b92-804a-c94135bff5ae-kube-api-access-wfjqx\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:24 crc kubenswrapper[4774]: I1003 15:00:24.075901 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8wp2\" (UniqueName: \"kubernetes.io/projected/13144fd0-9c06-480f-8809-1021c8f2ccd3-kube-api-access-d8wp2\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:25 crc kubenswrapper[4774]: I1003 15:00:25.022068 4774 generic.go:334] "Generic (PLEG): container finished" podID="4790dde0-2293-42c3-b068-19ac0c89968f" containerID="f417c59c012e5a0de4b6a1e81fbc738bfcbc98a00c4988c6b154619536ee031a" exitCode=0 Oct 03 15:00:25 crc kubenswrapper[4774]: I1003 15:00:25.022168 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-qlzcz" event={"ID":"4790dde0-2293-42c3-b068-19ac0c89968f","Type":"ContainerDied","Data":"f417c59c012e5a0de4b6a1e81fbc738bfcbc98a00c4988c6b154619536ee031a"} Oct 03 15:00:25 crc kubenswrapper[4774]: I1003 15:00:25.023384 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-x6k8t" event={"ID":"10568dd1-d320-4fed-b12b-7ded3500a3e9","Type":"ContainerStarted","Data":"6fb064fe23f4aa8e4bd57b9b12104c84f6132e63dc26661ddd367b3baae405dc"} Oct 03 15:00:25 crc kubenswrapper[4774]: I1003 15:00:25.079947 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-x6k8t" podStartSLOduration=5.038305255 podStartE2EDuration="10.079927562s" podCreationTimestamp="2025-10-03 15:00:15 +0000 UTC" firstStartedPulling="2025-10-03 15:00:18.673429003 +0000 UTC m=+1041.262632455" lastFinishedPulling="2025-10-03 15:00:23.71505131 +0000 UTC m=+1046.304254762" observedRunningTime="2025-10-03 
15:00:25.07543949 +0000 UTC m=+1047.664642942" watchObservedRunningTime="2025-10-03 15:00:25.079927562 +0000 UTC m=+1047.669131014" Oct 03 15:00:25 crc kubenswrapper[4774]: I1003 15:00:25.705261 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-a1da-account-create-z4r4k"] Oct 03 15:00:25 crc kubenswrapper[4774]: E1003 15:00:25.705976 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13144fd0-9c06-480f-8809-1021c8f2ccd3" containerName="mariadb-database-create" Oct 03 15:00:25 crc kubenswrapper[4774]: I1003 15:00:25.705996 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="13144fd0-9c06-480f-8809-1021c8f2ccd3" containerName="mariadb-database-create" Oct 03 15:00:25 crc kubenswrapper[4774]: E1003 15:00:25.706011 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb7d18d2-5f77-4f17-8fce-7ea5b663a23c" containerName="mariadb-database-create" Oct 03 15:00:25 crc kubenswrapper[4774]: I1003 15:00:25.706017 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb7d18d2-5f77-4f17-8fce-7ea5b663a23c" containerName="mariadb-database-create" Oct 03 15:00:25 crc kubenswrapper[4774]: E1003 15:00:25.706025 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5dad4cd-6b95-4b92-804a-c94135bff5ae" containerName="mariadb-database-create" Oct 03 15:00:25 crc kubenswrapper[4774]: I1003 15:00:25.706031 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5dad4cd-6b95-4b92-804a-c94135bff5ae" containerName="mariadb-database-create" Oct 03 15:00:25 crc kubenswrapper[4774]: I1003 15:00:25.706174 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="13144fd0-9c06-480f-8809-1021c8f2ccd3" containerName="mariadb-database-create" Oct 03 15:00:25 crc kubenswrapper[4774]: I1003 15:00:25.706186 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5dad4cd-6b95-4b92-804a-c94135bff5ae" containerName="mariadb-database-create" Oct 03 15:00:25 crc kubenswrapper[4774]: 
I1003 15:00:25.706206 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb7d18d2-5f77-4f17-8fce-7ea5b663a23c" containerName="mariadb-database-create" Oct 03 15:00:25 crc kubenswrapper[4774]: I1003 15:00:25.706765 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a1da-account-create-z4r4k" Oct 03 15:00:25 crc kubenswrapper[4774]: I1003 15:00:25.708998 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 03 15:00:25 crc kubenswrapper[4774]: I1003 15:00:25.722608 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a1da-account-create-z4r4k"] Oct 03 15:00:25 crc kubenswrapper[4774]: I1003 15:00:25.808515 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw6lk\" (UniqueName: \"kubernetes.io/projected/66ac98ed-7e19-455d-825f-87ef2b381b43-kube-api-access-hw6lk\") pod \"cinder-a1da-account-create-z4r4k\" (UID: \"66ac98ed-7e19-455d-825f-87ef2b381b43\") " pod="openstack/cinder-a1da-account-create-z4r4k" Oct 03 15:00:25 crc kubenswrapper[4774]: I1003 15:00:25.910709 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw6lk\" (UniqueName: \"kubernetes.io/projected/66ac98ed-7e19-455d-825f-87ef2b381b43-kube-api-access-hw6lk\") pod \"cinder-a1da-account-create-z4r4k\" (UID: \"66ac98ed-7e19-455d-825f-87ef2b381b43\") " pod="openstack/cinder-a1da-account-create-z4r4k" Oct 03 15:00:25 crc kubenswrapper[4774]: I1003 15:00:25.936328 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw6lk\" (UniqueName: \"kubernetes.io/projected/66ac98ed-7e19-455d-825f-87ef2b381b43-kube-api-access-hw6lk\") pod \"cinder-a1da-account-create-z4r4k\" (UID: \"66ac98ed-7e19-455d-825f-87ef2b381b43\") " pod="openstack/cinder-a1da-account-create-z4r4k" Oct 03 15:00:26 crc kubenswrapper[4774]: I1003 15:00:26.022982 4774 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a1da-account-create-z4r4k" Oct 03 15:00:26 crc kubenswrapper[4774]: I1003 15:00:26.041942 4774 generic.go:334] "Generic (PLEG): container finished" podID="6c2fdc49-a155-4a5b-afce-314af82e3f5b" containerID="8c116304c75bf62309ede65631dad7f489deaa5d260dfc5583261549653bb8c9" exitCode=0 Oct 03 15:00:26 crc kubenswrapper[4774]: I1003 15:00:26.042005 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-k72d4" event={"ID":"6c2fdc49-a155-4a5b-afce-314af82e3f5b","Type":"ContainerDied","Data":"8c116304c75bf62309ede65631dad7f489deaa5d260dfc5583261549653bb8c9"} Oct 03 15:00:26 crc kubenswrapper[4774]: I1003 15:00:26.044667 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-qlzcz" event={"ID":"4790dde0-2293-42c3-b068-19ac0c89968f","Type":"ContainerStarted","Data":"906e100b23bc124b02c423a32c15de8d18e6674c08ab161e8c49adaa9464df7a"} Oct 03 15:00:26 crc kubenswrapper[4774]: I1003 15:00:26.044748 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-qlzcz" Oct 03 15:00:26 crc kubenswrapper[4774]: I1003 15:00:26.087225 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77585f5f8c-qlzcz" podStartSLOduration=6.087202355 podStartE2EDuration="6.087202355s" podCreationTimestamp="2025-10-03 15:00:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:00:26.086290222 +0000 UTC m=+1048.675493714" watchObservedRunningTime="2025-10-03 15:00:26.087202355 +0000 UTC m=+1048.676405807" Oct 03 15:00:26 crc kubenswrapper[4774]: I1003 15:00:26.493615 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a1da-account-create-z4r4k"] Oct 03 15:00:27 crc kubenswrapper[4774]: I1003 15:00:27.053445 4774 generic.go:334] 
"Generic (PLEG): container finished" podID="10568dd1-d320-4fed-b12b-7ded3500a3e9" containerID="6fb064fe23f4aa8e4bd57b9b12104c84f6132e63dc26661ddd367b3baae405dc" exitCode=0 Oct 03 15:00:27 crc kubenswrapper[4774]: I1003 15:00:27.053578 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-x6k8t" event={"ID":"10568dd1-d320-4fed-b12b-7ded3500a3e9","Type":"ContainerDied","Data":"6fb064fe23f4aa8e4bd57b9b12104c84f6132e63dc26661ddd367b3baae405dc"} Oct 03 15:00:27 crc kubenswrapper[4774]: I1003 15:00:27.056714 4774 generic.go:334] "Generic (PLEG): container finished" podID="66ac98ed-7e19-455d-825f-87ef2b381b43" containerID="7b39a8fe987c0e29a566f711e63a46f7eb92578ea76b8c8df8f17634c9385a91" exitCode=0 Oct 03 15:00:27 crc kubenswrapper[4774]: I1003 15:00:27.056831 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a1da-account-create-z4r4k" event={"ID":"66ac98ed-7e19-455d-825f-87ef2b381b43","Type":"ContainerDied","Data":"7b39a8fe987c0e29a566f711e63a46f7eb92578ea76b8c8df8f17634c9385a91"} Oct 03 15:00:27 crc kubenswrapper[4774]: I1003 15:00:27.056909 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a1da-account-create-z4r4k" event={"ID":"66ac98ed-7e19-455d-825f-87ef2b381b43","Type":"ContainerStarted","Data":"0a3f442d30bf4f33bdcda7e038e372adaaecceff1bcd337cea4a00e4a024e370"} Oct 03 15:00:27 crc kubenswrapper[4774]: I1003 15:00:27.497807 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-k72d4" Oct 03 15:00:27 crc kubenswrapper[4774]: I1003 15:00:27.538027 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmhlw\" (UniqueName: \"kubernetes.io/projected/6c2fdc49-a155-4a5b-afce-314af82e3f5b-kube-api-access-wmhlw\") pod \"6c2fdc49-a155-4a5b-afce-314af82e3f5b\" (UID: \"6c2fdc49-a155-4a5b-afce-314af82e3f5b\") " Oct 03 15:00:27 crc kubenswrapper[4774]: I1003 15:00:27.538155 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6c2fdc49-a155-4a5b-afce-314af82e3f5b-db-sync-config-data\") pod \"6c2fdc49-a155-4a5b-afce-314af82e3f5b\" (UID: \"6c2fdc49-a155-4a5b-afce-314af82e3f5b\") " Oct 03 15:00:27 crc kubenswrapper[4774]: I1003 15:00:27.538231 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c2fdc49-a155-4a5b-afce-314af82e3f5b-combined-ca-bundle\") pod \"6c2fdc49-a155-4a5b-afce-314af82e3f5b\" (UID: \"6c2fdc49-a155-4a5b-afce-314af82e3f5b\") " Oct 03 15:00:27 crc kubenswrapper[4774]: I1003 15:00:27.538305 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c2fdc49-a155-4a5b-afce-314af82e3f5b-config-data\") pod \"6c2fdc49-a155-4a5b-afce-314af82e3f5b\" (UID: \"6c2fdc49-a155-4a5b-afce-314af82e3f5b\") " Oct 03 15:00:27 crc kubenswrapper[4774]: I1003 15:00:27.544065 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c2fdc49-a155-4a5b-afce-314af82e3f5b-kube-api-access-wmhlw" (OuterVolumeSpecName: "kube-api-access-wmhlw") pod "6c2fdc49-a155-4a5b-afce-314af82e3f5b" (UID: "6c2fdc49-a155-4a5b-afce-314af82e3f5b"). InnerVolumeSpecName "kube-api-access-wmhlw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:00:27 crc kubenswrapper[4774]: I1003 15:00:27.544362 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c2fdc49-a155-4a5b-afce-314af82e3f5b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6c2fdc49-a155-4a5b-afce-314af82e3f5b" (UID: "6c2fdc49-a155-4a5b-afce-314af82e3f5b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:00:27 crc kubenswrapper[4774]: I1003 15:00:27.568737 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c2fdc49-a155-4a5b-afce-314af82e3f5b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c2fdc49-a155-4a5b-afce-314af82e3f5b" (UID: "6c2fdc49-a155-4a5b-afce-314af82e3f5b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:00:27 crc kubenswrapper[4774]: I1003 15:00:27.592340 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c2fdc49-a155-4a5b-afce-314af82e3f5b-config-data" (OuterVolumeSpecName: "config-data") pod "6c2fdc49-a155-4a5b-afce-314af82e3f5b" (UID: "6c2fdc49-a155-4a5b-afce-314af82e3f5b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:00:27 crc kubenswrapper[4774]: I1003 15:00:27.639996 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c2fdc49-a155-4a5b-afce-314af82e3f5b-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:27 crc kubenswrapper[4774]: I1003 15:00:27.640232 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmhlw\" (UniqueName: \"kubernetes.io/projected/6c2fdc49-a155-4a5b-afce-314af82e3f5b-kube-api-access-wmhlw\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:27 crc kubenswrapper[4774]: I1003 15:00:27.640295 4774 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6c2fdc49-a155-4a5b-afce-314af82e3f5b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:27 crc kubenswrapper[4774]: I1003 15:00:27.640413 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c2fdc49-a155-4a5b-afce-314af82e3f5b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:28 crc kubenswrapper[4774]: I1003 15:00:28.065322 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-k72d4" event={"ID":"6c2fdc49-a155-4a5b-afce-314af82e3f5b","Type":"ContainerDied","Data":"544d3a3f81f69935528a83f365ce7c5a078a316509b6924f416e1dbd7109af4e"} Oct 03 15:00:28 crc kubenswrapper[4774]: I1003 15:00:28.065391 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="544d3a3f81f69935528a83f365ce7c5a078a316509b6924f416e1dbd7109af4e" Oct 03 15:00:28 crc kubenswrapper[4774]: I1003 15:00:28.067438 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-k72d4" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.532951 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-qlzcz"] Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.533541 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77585f5f8c-qlzcz" podUID="4790dde0-2293-42c3-b068-19ac0c89968f" containerName="dnsmasq-dns" containerID="cri-o://906e100b23bc124b02c423a32c15de8d18e6674c08ab161e8c49adaa9464df7a" gracePeriod=10 Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.587798 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-7kxbv"] Oct 03 15:00:29 crc kubenswrapper[4774]: E1003 15:00:28.588256 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c2fdc49-a155-4a5b-afce-314af82e3f5b" containerName="glance-db-sync" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.588272 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c2fdc49-a155-4a5b-afce-314af82e3f5b" containerName="glance-db-sync" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.588522 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c2fdc49-a155-4a5b-afce-314af82e3f5b" containerName="glance-db-sync" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.589572 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-7kxbv" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.589866 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-x6k8t" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.594944 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-7kxbv"] Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.601109 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-a1da-account-create-z4r4k" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.683471 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv5p6\" (UniqueName: \"kubernetes.io/projected/10568dd1-d320-4fed-b12b-7ded3500a3e9-kube-api-access-hv5p6\") pod \"10568dd1-d320-4fed-b12b-7ded3500a3e9\" (UID: \"10568dd1-d320-4fed-b12b-7ded3500a3e9\") " Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.683632 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10568dd1-d320-4fed-b12b-7ded3500a3e9-combined-ca-bundle\") pod \"10568dd1-d320-4fed-b12b-7ded3500a3e9\" (UID: \"10568dd1-d320-4fed-b12b-7ded3500a3e9\") " Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.683666 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10568dd1-d320-4fed-b12b-7ded3500a3e9-config-data\") pod \"10568dd1-d320-4fed-b12b-7ded3500a3e9\" (UID: \"10568dd1-d320-4fed-b12b-7ded3500a3e9\") " Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.683888 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xznrr\" (UniqueName: \"kubernetes.io/projected/17498ea5-9a08-42d6-bf48-e2eeca875964-kube-api-access-xznrr\") pod \"dnsmasq-dns-7ff5475cc9-7kxbv\" (UID: \"17498ea5-9a08-42d6-bf48-e2eeca875964\") " pod="openstack/dnsmasq-dns-7ff5475cc9-7kxbv" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.683916 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17498ea5-9a08-42d6-bf48-e2eeca875964-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-7kxbv\" (UID: \"17498ea5-9a08-42d6-bf48-e2eeca875964\") " pod="openstack/dnsmasq-dns-7ff5475cc9-7kxbv" Oct 03 
15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.683931 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17498ea5-9a08-42d6-bf48-e2eeca875964-config\") pod \"dnsmasq-dns-7ff5475cc9-7kxbv\" (UID: \"17498ea5-9a08-42d6-bf48-e2eeca875964\") " pod="openstack/dnsmasq-dns-7ff5475cc9-7kxbv" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.683967 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17498ea5-9a08-42d6-bf48-e2eeca875964-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-7kxbv\" (UID: \"17498ea5-9a08-42d6-bf48-e2eeca875964\") " pod="openstack/dnsmasq-dns-7ff5475cc9-7kxbv" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.683992 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17498ea5-9a08-42d6-bf48-e2eeca875964-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-7kxbv\" (UID: \"17498ea5-9a08-42d6-bf48-e2eeca875964\") " pod="openstack/dnsmasq-dns-7ff5475cc9-7kxbv" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.684013 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17498ea5-9a08-42d6-bf48-e2eeca875964-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-7kxbv\" (UID: \"17498ea5-9a08-42d6-bf48-e2eeca875964\") " pod="openstack/dnsmasq-dns-7ff5475cc9-7kxbv" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.719087 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10568dd1-d320-4fed-b12b-7ded3500a3e9-kube-api-access-hv5p6" (OuterVolumeSpecName: "kube-api-access-hv5p6") pod "10568dd1-d320-4fed-b12b-7ded3500a3e9" (UID: "10568dd1-d320-4fed-b12b-7ded3500a3e9"). 
InnerVolumeSpecName "kube-api-access-hv5p6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.741636 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10568dd1-d320-4fed-b12b-7ded3500a3e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10568dd1-d320-4fed-b12b-7ded3500a3e9" (UID: "10568dd1-d320-4fed-b12b-7ded3500a3e9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.754528 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10568dd1-d320-4fed-b12b-7ded3500a3e9-config-data" (OuterVolumeSpecName: "config-data") pod "10568dd1-d320-4fed-b12b-7ded3500a3e9" (UID: "10568dd1-d320-4fed-b12b-7ded3500a3e9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.784954 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hw6lk\" (UniqueName: \"kubernetes.io/projected/66ac98ed-7e19-455d-825f-87ef2b381b43-kube-api-access-hw6lk\") pod \"66ac98ed-7e19-455d-825f-87ef2b381b43\" (UID: \"66ac98ed-7e19-455d-825f-87ef2b381b43\") " Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.785340 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xznrr\" (UniqueName: \"kubernetes.io/projected/17498ea5-9a08-42d6-bf48-e2eeca875964-kube-api-access-xznrr\") pod \"dnsmasq-dns-7ff5475cc9-7kxbv\" (UID: \"17498ea5-9a08-42d6-bf48-e2eeca875964\") " pod="openstack/dnsmasq-dns-7ff5475cc9-7kxbv" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.785403 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/17498ea5-9a08-42d6-bf48-e2eeca875964-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-7kxbv\" (UID: \"17498ea5-9a08-42d6-bf48-e2eeca875964\") " pod="openstack/dnsmasq-dns-7ff5475cc9-7kxbv" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.785420 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17498ea5-9a08-42d6-bf48-e2eeca875964-config\") pod \"dnsmasq-dns-7ff5475cc9-7kxbv\" (UID: \"17498ea5-9a08-42d6-bf48-e2eeca875964\") " pod="openstack/dnsmasq-dns-7ff5475cc9-7kxbv" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.785459 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17498ea5-9a08-42d6-bf48-e2eeca875964-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-7kxbv\" (UID: \"17498ea5-9a08-42d6-bf48-e2eeca875964\") " pod="openstack/dnsmasq-dns-7ff5475cc9-7kxbv" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.785484 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17498ea5-9a08-42d6-bf48-e2eeca875964-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-7kxbv\" (UID: \"17498ea5-9a08-42d6-bf48-e2eeca875964\") " pod="openstack/dnsmasq-dns-7ff5475cc9-7kxbv" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.785505 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17498ea5-9a08-42d6-bf48-e2eeca875964-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-7kxbv\" (UID: \"17498ea5-9a08-42d6-bf48-e2eeca875964\") " pod="openstack/dnsmasq-dns-7ff5475cc9-7kxbv" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.785560 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hv5p6\" (UniqueName: \"kubernetes.io/projected/10568dd1-d320-4fed-b12b-7ded3500a3e9-kube-api-access-hv5p6\") on 
node \"crc\" DevicePath \"\"" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.785574 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10568dd1-d320-4fed-b12b-7ded3500a3e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.785584 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10568dd1-d320-4fed-b12b-7ded3500a3e9-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.786826 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17498ea5-9a08-42d6-bf48-e2eeca875964-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-7kxbv\" (UID: \"17498ea5-9a08-42d6-bf48-e2eeca875964\") " pod="openstack/dnsmasq-dns-7ff5475cc9-7kxbv" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.787869 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17498ea5-9a08-42d6-bf48-e2eeca875964-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-7kxbv\" (UID: \"17498ea5-9a08-42d6-bf48-e2eeca875964\") " pod="openstack/dnsmasq-dns-7ff5475cc9-7kxbv" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.787991 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17498ea5-9a08-42d6-bf48-e2eeca875964-config\") pod \"dnsmasq-dns-7ff5475cc9-7kxbv\" (UID: \"17498ea5-9a08-42d6-bf48-e2eeca875964\") " pod="openstack/dnsmasq-dns-7ff5475cc9-7kxbv" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.788355 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17498ea5-9a08-42d6-bf48-e2eeca875964-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-7kxbv\" (UID: 
\"17498ea5-9a08-42d6-bf48-e2eeca875964\") " pod="openstack/dnsmasq-dns-7ff5475cc9-7kxbv" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.788492 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17498ea5-9a08-42d6-bf48-e2eeca875964-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-7kxbv\" (UID: \"17498ea5-9a08-42d6-bf48-e2eeca875964\") " pod="openstack/dnsmasq-dns-7ff5475cc9-7kxbv" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.791695 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66ac98ed-7e19-455d-825f-87ef2b381b43-kube-api-access-hw6lk" (OuterVolumeSpecName: "kube-api-access-hw6lk") pod "66ac98ed-7e19-455d-825f-87ef2b381b43" (UID: "66ac98ed-7e19-455d-825f-87ef2b381b43"). InnerVolumeSpecName "kube-api-access-hw6lk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.803620 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xznrr\" (UniqueName: \"kubernetes.io/projected/17498ea5-9a08-42d6-bf48-e2eeca875964-kube-api-access-xznrr\") pod \"dnsmasq-dns-7ff5475cc9-7kxbv\" (UID: \"17498ea5-9a08-42d6-bf48-e2eeca875964\") " pod="openstack/dnsmasq-dns-7ff5475cc9-7kxbv" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.887846 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hw6lk\" (UniqueName: \"kubernetes.io/projected/66ac98ed-7e19-455d-825f-87ef2b381b43-kube-api-access-hw6lk\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:28.951709 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-7kxbv" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.083871 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-x6k8t" event={"ID":"10568dd1-d320-4fed-b12b-7ded3500a3e9","Type":"ContainerDied","Data":"7d3766c3da9f39ffc2a9aa7672b17876abd4d82f27aaf50e53febaecb4df190b"} Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.083918 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d3766c3da9f39ffc2a9aa7672b17876abd4d82f27aaf50e53febaecb4df190b" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.083993 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-x6k8t" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.089172 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a1da-account-create-z4r4k" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.089168 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a1da-account-create-z4r4k" event={"ID":"66ac98ed-7e19-455d-825f-87ef2b381b43","Type":"ContainerDied","Data":"0a3f442d30bf4f33bdcda7e038e372adaaecceff1bcd337cea4a00e4a024e370"} Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.089253 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a3f442d30bf4f33bdcda7e038e372adaaecceff1bcd337cea4a00e4a024e370" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.091557 4774 generic.go:334] "Generic (PLEG): container finished" podID="4790dde0-2293-42c3-b068-19ac0c89968f" containerID="906e100b23bc124b02c423a32c15de8d18e6674c08ab161e8c49adaa9464df7a" exitCode=0 Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.091606 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-qlzcz" 
event={"ID":"4790dde0-2293-42c3-b068-19ac0c89968f","Type":"ContainerDied","Data":"906e100b23bc124b02c423a32c15de8d18e6674c08ab161e8c49adaa9464df7a"} Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.323366 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-qlzcz" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.500209 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p7g9\" (UniqueName: \"kubernetes.io/projected/4790dde0-2293-42c3-b068-19ac0c89968f-kube-api-access-4p7g9\") pod \"4790dde0-2293-42c3-b068-19ac0c89968f\" (UID: \"4790dde0-2293-42c3-b068-19ac0c89968f\") " Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.500340 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4790dde0-2293-42c3-b068-19ac0c89968f-dns-swift-storage-0\") pod \"4790dde0-2293-42c3-b068-19ac0c89968f\" (UID: \"4790dde0-2293-42c3-b068-19ac0c89968f\") " Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.500428 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4790dde0-2293-42c3-b068-19ac0c89968f-ovsdbserver-sb\") pod \"4790dde0-2293-42c3-b068-19ac0c89968f\" (UID: \"4790dde0-2293-42c3-b068-19ac0c89968f\") " Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.500530 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4790dde0-2293-42c3-b068-19ac0c89968f-ovsdbserver-nb\") pod \"4790dde0-2293-42c3-b068-19ac0c89968f\" (UID: \"4790dde0-2293-42c3-b068-19ac0c89968f\") " Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.500593 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4790dde0-2293-42c3-b068-19ac0c89968f-config\") pod \"4790dde0-2293-42c3-b068-19ac0c89968f\" (UID: \"4790dde0-2293-42c3-b068-19ac0c89968f\") " Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.500643 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4790dde0-2293-42c3-b068-19ac0c89968f-dns-svc\") pod \"4790dde0-2293-42c3-b068-19ac0c89968f\" (UID: \"4790dde0-2293-42c3-b068-19ac0c89968f\") " Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.513681 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4790dde0-2293-42c3-b068-19ac0c89968f-kube-api-access-4p7g9" (OuterVolumeSpecName: "kube-api-access-4p7g9") pod "4790dde0-2293-42c3-b068-19ac0c89968f" (UID: "4790dde0-2293-42c3-b068-19ac0c89968f"). InnerVolumeSpecName "kube-api-access-4p7g9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.547390 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4790dde0-2293-42c3-b068-19ac0c89968f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4790dde0-2293-42c3-b068-19ac0c89968f" (UID: "4790dde0-2293-42c3-b068-19ac0c89968f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.553106 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-7kxbv"] Oct 03 15:00:29 crc kubenswrapper[4774]: W1003 15:00:29.561174 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17498ea5_9a08_42d6_bf48_e2eeca875964.slice/crio-abe6e1165c487cd013a66deae7742c4422924d9c0917eecf12c589d1f0a232dc WatchSource:0}: Error finding container abe6e1165c487cd013a66deae7742c4422924d9c0917eecf12c589d1f0a232dc: Status 404 returned error can't find the container with id abe6e1165c487cd013a66deae7742c4422924d9c0917eecf12c589d1f0a232dc Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.570926 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4790dde0-2293-42c3-b068-19ac0c89968f-config" (OuterVolumeSpecName: "config") pod "4790dde0-2293-42c3-b068-19ac0c89968f" (UID: "4790dde0-2293-42c3-b068-19ac0c89968f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.571857 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4790dde0-2293-42c3-b068-19ac0c89968f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4790dde0-2293-42c3-b068-19ac0c89968f" (UID: "4790dde0-2293-42c3-b068-19ac0c89968f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.585516 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4790dde0-2293-42c3-b068-19ac0c89968f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4790dde0-2293-42c3-b068-19ac0c89968f" (UID: "4790dde0-2293-42c3-b068-19ac0c89968f"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.594328 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4790dde0-2293-42c3-b068-19ac0c89968f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4790dde0-2293-42c3-b068-19ac0c89968f" (UID: "4790dde0-2293-42c3-b068-19ac0c89968f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.602351 4774 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4790dde0-2293-42c3-b068-19ac0c89968f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.602414 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4790dde0-2293-42c3-b068-19ac0c89968f-config\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.602426 4774 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4790dde0-2293-42c3-b068-19ac0c89968f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.602451 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p7g9\" (UniqueName: \"kubernetes.io/projected/4790dde0-2293-42c3-b068-19ac0c89968f-kube-api-access-4p7g9\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.602481 4774 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4790dde0-2293-42c3-b068-19ac0c89968f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.602490 4774 reconciler_common.go:293] "Volume detached for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4790dde0-2293-42c3-b068-19ac0c89968f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.814577 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-7kxbv"] Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.833421 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-22k8m"] Oct 03 15:00:29 crc kubenswrapper[4774]: E1003 15:00:29.833729 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4790dde0-2293-42c3-b068-19ac0c89968f" containerName="init" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.833745 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="4790dde0-2293-42c3-b068-19ac0c89968f" containerName="init" Oct 03 15:00:29 crc kubenswrapper[4774]: E1003 15:00:29.833763 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66ac98ed-7e19-455d-825f-87ef2b381b43" containerName="mariadb-account-create" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.833771 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="66ac98ed-7e19-455d-825f-87ef2b381b43" containerName="mariadb-account-create" Oct 03 15:00:29 crc kubenswrapper[4774]: E1003 15:00:29.833791 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10568dd1-d320-4fed-b12b-7ded3500a3e9" containerName="keystone-db-sync" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.833797 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="10568dd1-d320-4fed-b12b-7ded3500a3e9" containerName="keystone-db-sync" Oct 03 15:00:29 crc kubenswrapper[4774]: E1003 15:00:29.833816 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4790dde0-2293-42c3-b068-19ac0c89968f" containerName="dnsmasq-dns" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.833823 4774 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4790dde0-2293-42c3-b068-19ac0c89968f" containerName="dnsmasq-dns" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.833985 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="66ac98ed-7e19-455d-825f-87ef2b381b43" containerName="mariadb-account-create" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.834005 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="4790dde0-2293-42c3-b068-19ac0c89968f" containerName="dnsmasq-dns" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.834015 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="10568dd1-d320-4fed-b12b-7ded3500a3e9" containerName="keystone-db-sync" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.835055 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-22k8m" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.856765 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-vtqvj"] Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.857739 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-vtqvj" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.862576 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.862727 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-5rssc" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.862876 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.862991 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.901413 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-22k8m"] Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.939905 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vtqvj"] Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.942182 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6552c40b-eb16-484f-bcbb-064f92daeab8-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-22k8m\" (UID: \"6552c40b-eb16-484f-bcbb-064f92daeab8\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-22k8m" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.942265 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6552c40b-eb16-484f-bcbb-064f92daeab8-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-22k8m\" (UID: \"6552c40b-eb16-484f-bcbb-064f92daeab8\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-22k8m" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.942468 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-llrh4\" (UniqueName: \"kubernetes.io/projected/6552c40b-eb16-484f-bcbb-064f92daeab8-kube-api-access-llrh4\") pod \"dnsmasq-dns-5c5cc7c5ff-22k8m\" (UID: \"6552c40b-eb16-484f-bcbb-064f92daeab8\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-22k8m" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.942659 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6552c40b-eb16-484f-bcbb-064f92daeab8-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-22k8m\" (UID: \"6552c40b-eb16-484f-bcbb-064f92daeab8\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-22k8m" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.942718 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6552c40b-eb16-484f-bcbb-064f92daeab8-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-22k8m\" (UID: \"6552c40b-eb16-484f-bcbb-064f92daeab8\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-22k8m" Oct 03 15:00:29 crc kubenswrapper[4774]: I1003 15:00:29.942863 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6552c40b-eb16-484f-bcbb-064f92daeab8-config\") pod \"dnsmasq-dns-5c5cc7c5ff-22k8m\" (UID: \"6552c40b-eb16-484f-bcbb-064f92daeab8\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-22k8m" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.019528 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-587669876f-dmzh9"] Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.021085 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-587669876f-dmzh9" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.032062 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.032315 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-78gw2" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.032543 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.032838 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.033579 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-587669876f-dmzh9"] Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.050929 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f303caa2-7916-4bdc-ba01-b5c55166dc53-scripts\") pod \"keystone-bootstrap-vtqvj\" (UID: \"f303caa2-7916-4bdc-ba01-b5c55166dc53\") " pod="openstack/keystone-bootstrap-vtqvj" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.051003 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6552c40b-eb16-484f-bcbb-064f92daeab8-config\") pod \"dnsmasq-dns-5c5cc7c5ff-22k8m\" (UID: \"6552c40b-eb16-484f-bcbb-064f92daeab8\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-22k8m" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.051039 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glr9f\" (UniqueName: \"kubernetes.io/projected/f303caa2-7916-4bdc-ba01-b5c55166dc53-kube-api-access-glr9f\") pod \"keystone-bootstrap-vtqvj\" (UID: 
\"f303caa2-7916-4bdc-ba01-b5c55166dc53\") " pod="openstack/keystone-bootstrap-vtqvj" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.051061 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6552c40b-eb16-484f-bcbb-064f92daeab8-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-22k8m\" (UID: \"6552c40b-eb16-484f-bcbb-064f92daeab8\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-22k8m" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.051085 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6552c40b-eb16-484f-bcbb-064f92daeab8-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-22k8m\" (UID: \"6552c40b-eb16-484f-bcbb-064f92daeab8\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-22k8m" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.051106 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b635609a-ab1c-4691-be86-da83abc3e663-horizon-secret-key\") pod \"horizon-587669876f-dmzh9\" (UID: \"b635609a-ab1c-4691-be86-da83abc3e663\") " pod="openstack/horizon-587669876f-dmzh9" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.051128 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f303caa2-7916-4bdc-ba01-b5c55166dc53-config-data\") pod \"keystone-bootstrap-vtqvj\" (UID: \"f303caa2-7916-4bdc-ba01-b5c55166dc53\") " pod="openstack/keystone-bootstrap-vtqvj" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.051149 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b635609a-ab1c-4691-be86-da83abc3e663-scripts\") pod \"horizon-587669876f-dmzh9\" (UID: \"b635609a-ab1c-4691-be86-da83abc3e663\") " 
pod="openstack/horizon-587669876f-dmzh9" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.051166 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f303caa2-7916-4bdc-ba01-b5c55166dc53-fernet-keys\") pod \"keystone-bootstrap-vtqvj\" (UID: \"f303caa2-7916-4bdc-ba01-b5c55166dc53\") " pod="openstack/keystone-bootstrap-vtqvj" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.051199 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b635609a-ab1c-4691-be86-da83abc3e663-logs\") pod \"horizon-587669876f-dmzh9\" (UID: \"b635609a-ab1c-4691-be86-da83abc3e663\") " pod="openstack/horizon-587669876f-dmzh9" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.051232 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f303caa2-7916-4bdc-ba01-b5c55166dc53-combined-ca-bundle\") pod \"keystone-bootstrap-vtqvj\" (UID: \"f303caa2-7916-4bdc-ba01-b5c55166dc53\") " pod="openstack/keystone-bootstrap-vtqvj" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.051251 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llrh4\" (UniqueName: \"kubernetes.io/projected/6552c40b-eb16-484f-bcbb-064f92daeab8-kube-api-access-llrh4\") pod \"dnsmasq-dns-5c5cc7c5ff-22k8m\" (UID: \"6552c40b-eb16-484f-bcbb-064f92daeab8\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-22k8m" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.051290 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6552c40b-eb16-484f-bcbb-064f92daeab8-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-22k8m\" (UID: \"6552c40b-eb16-484f-bcbb-064f92daeab8\") " 
pod="openstack/dnsmasq-dns-5c5cc7c5ff-22k8m" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.051308 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b635609a-ab1c-4691-be86-da83abc3e663-config-data\") pod \"horizon-587669876f-dmzh9\" (UID: \"b635609a-ab1c-4691-be86-da83abc3e663\") " pod="openstack/horizon-587669876f-dmzh9" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.051326 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bts6v\" (UniqueName: \"kubernetes.io/projected/b635609a-ab1c-4691-be86-da83abc3e663-kube-api-access-bts6v\") pod \"horizon-587669876f-dmzh9\" (UID: \"b635609a-ab1c-4691-be86-da83abc3e663\") " pod="openstack/horizon-587669876f-dmzh9" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.051357 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6552c40b-eb16-484f-bcbb-064f92daeab8-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-22k8m\" (UID: \"6552c40b-eb16-484f-bcbb-064f92daeab8\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-22k8m" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.051448 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f303caa2-7916-4bdc-ba01-b5c55166dc53-credential-keys\") pod \"keystone-bootstrap-vtqvj\" (UID: \"f303caa2-7916-4bdc-ba01-b5c55166dc53\") " pod="openstack/keystone-bootstrap-vtqvj" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.051960 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6552c40b-eb16-484f-bcbb-064f92daeab8-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-22k8m\" (UID: \"6552c40b-eb16-484f-bcbb-064f92daeab8\") " 
pod="openstack/dnsmasq-dns-5c5cc7c5ff-22k8m" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.052508 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6552c40b-eb16-484f-bcbb-064f92daeab8-config\") pod \"dnsmasq-dns-5c5cc7c5ff-22k8m\" (UID: \"6552c40b-eb16-484f-bcbb-064f92daeab8\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-22k8m" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.053193 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6552c40b-eb16-484f-bcbb-064f92daeab8-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-22k8m\" (UID: \"6552c40b-eb16-484f-bcbb-064f92daeab8\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-22k8m" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.053285 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6552c40b-eb16-484f-bcbb-064f92daeab8-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-22k8m\" (UID: \"6552c40b-eb16-484f-bcbb-064f92daeab8\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-22k8m" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.053752 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6552c40b-eb16-484f-bcbb-064f92daeab8-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-22k8m\" (UID: \"6552c40b-eb16-484f-bcbb-064f92daeab8\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-22k8m" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.085251 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llrh4\" (UniqueName: \"kubernetes.io/projected/6552c40b-eb16-484f-bcbb-064f92daeab8-kube-api-access-llrh4\") pod \"dnsmasq-dns-5c5cc7c5ff-22k8m\" (UID: \"6552c40b-eb16-484f-bcbb-064f92daeab8\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-22k8m" Oct 03 15:00:30 crc kubenswrapper[4774]: 
I1003 15:00:30.154648 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b635609a-ab1c-4691-be86-da83abc3e663-config-data\") pod \"horizon-587669876f-dmzh9\" (UID: \"b635609a-ab1c-4691-be86-da83abc3e663\") " pod="openstack/horizon-587669876f-dmzh9" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.154687 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bts6v\" (UniqueName: \"kubernetes.io/projected/b635609a-ab1c-4691-be86-da83abc3e663-kube-api-access-bts6v\") pod \"horizon-587669876f-dmzh9\" (UID: \"b635609a-ab1c-4691-be86-da83abc3e663\") " pod="openstack/horizon-587669876f-dmzh9" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.154717 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f303caa2-7916-4bdc-ba01-b5c55166dc53-credential-keys\") pod \"keystone-bootstrap-vtqvj\" (UID: \"f303caa2-7916-4bdc-ba01-b5c55166dc53\") " pod="openstack/keystone-bootstrap-vtqvj" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.154764 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f303caa2-7916-4bdc-ba01-b5c55166dc53-scripts\") pod \"keystone-bootstrap-vtqvj\" (UID: \"f303caa2-7916-4bdc-ba01-b5c55166dc53\") " pod="openstack/keystone-bootstrap-vtqvj" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.154794 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glr9f\" (UniqueName: \"kubernetes.io/projected/f303caa2-7916-4bdc-ba01-b5c55166dc53-kube-api-access-glr9f\") pod \"keystone-bootstrap-vtqvj\" (UID: \"f303caa2-7916-4bdc-ba01-b5c55166dc53\") " pod="openstack/keystone-bootstrap-vtqvj" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.154820 4774 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b635609a-ab1c-4691-be86-da83abc3e663-horizon-secret-key\") pod \"horizon-587669876f-dmzh9\" (UID: \"b635609a-ab1c-4691-be86-da83abc3e663\") " pod="openstack/horizon-587669876f-dmzh9" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.154836 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f303caa2-7916-4bdc-ba01-b5c55166dc53-config-data\") pod \"keystone-bootstrap-vtqvj\" (UID: \"f303caa2-7916-4bdc-ba01-b5c55166dc53\") " pod="openstack/keystone-bootstrap-vtqvj" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.154855 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b635609a-ab1c-4691-be86-da83abc3e663-scripts\") pod \"horizon-587669876f-dmzh9\" (UID: \"b635609a-ab1c-4691-be86-da83abc3e663\") " pod="openstack/horizon-587669876f-dmzh9" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.154872 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f303caa2-7916-4bdc-ba01-b5c55166dc53-fernet-keys\") pod \"keystone-bootstrap-vtqvj\" (UID: \"f303caa2-7916-4bdc-ba01-b5c55166dc53\") " pod="openstack/keystone-bootstrap-vtqvj" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.154896 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b635609a-ab1c-4691-be86-da83abc3e663-logs\") pod \"horizon-587669876f-dmzh9\" (UID: \"b635609a-ab1c-4691-be86-da83abc3e663\") " pod="openstack/horizon-587669876f-dmzh9" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.154921 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f303caa2-7916-4bdc-ba01-b5c55166dc53-combined-ca-bundle\") pod 
\"keystone-bootstrap-vtqvj\" (UID: \"f303caa2-7916-4bdc-ba01-b5c55166dc53\") " pod="openstack/keystone-bootstrap-vtqvj" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.157861 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-22k8m" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.159289 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b635609a-ab1c-4691-be86-da83abc3e663-scripts\") pod \"horizon-587669876f-dmzh9\" (UID: \"b635609a-ab1c-4691-be86-da83abc3e663\") " pod="openstack/horizon-587669876f-dmzh9" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.159986 4774 generic.go:334] "Generic (PLEG): container finished" podID="17498ea5-9a08-42d6-bf48-e2eeca875964" containerID="18794906c28148ff0626e9311eb8397e2206e5896bdb5e925fdbea8d99d8b334" exitCode=0 Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.160036 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-7kxbv" event={"ID":"17498ea5-9a08-42d6-bf48-e2eeca875964","Type":"ContainerDied","Data":"18794906c28148ff0626e9311eb8397e2206e5896bdb5e925fdbea8d99d8b334"} Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.160061 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-7kxbv" event={"ID":"17498ea5-9a08-42d6-bf48-e2eeca875964","Type":"ContainerStarted","Data":"abe6e1165c487cd013a66deae7742c4422924d9c0917eecf12c589d1f0a232dc"} Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.165982 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f303caa2-7916-4bdc-ba01-b5c55166dc53-credential-keys\") pod \"keystone-bootstrap-vtqvj\" (UID: \"f303caa2-7916-4bdc-ba01-b5c55166dc53\") " pod="openstack/keystone-bootstrap-vtqvj" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.167202 4774 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b635609a-ab1c-4691-be86-da83abc3e663-logs\") pod \"horizon-587669876f-dmzh9\" (UID: \"b635609a-ab1c-4691-be86-da83abc3e663\") " pod="openstack/horizon-587669876f-dmzh9" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.171275 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b635609a-ab1c-4691-be86-da83abc3e663-config-data\") pod \"horizon-587669876f-dmzh9\" (UID: \"b635609a-ab1c-4691-be86-da83abc3e663\") " pod="openstack/horizon-587669876f-dmzh9" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.172023 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f303caa2-7916-4bdc-ba01-b5c55166dc53-fernet-keys\") pod \"keystone-bootstrap-vtqvj\" (UID: \"f303caa2-7916-4bdc-ba01-b5c55166dc53\") " pod="openstack/keystone-bootstrap-vtqvj" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.175742 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f303caa2-7916-4bdc-ba01-b5c55166dc53-combined-ca-bundle\") pod \"keystone-bootstrap-vtqvj\" (UID: \"f303caa2-7916-4bdc-ba01-b5c55166dc53\") " pod="openstack/keystone-bootstrap-vtqvj" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.188872 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b635609a-ab1c-4691-be86-da83abc3e663-horizon-secret-key\") pod \"horizon-587669876f-dmzh9\" (UID: \"b635609a-ab1c-4691-be86-da83abc3e663\") " pod="openstack/horizon-587669876f-dmzh9" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.189996 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f303caa2-7916-4bdc-ba01-b5c55166dc53-scripts\") pod \"keystone-bootstrap-vtqvj\" (UID: \"f303caa2-7916-4bdc-ba01-b5c55166dc53\") " pod="openstack/keystone-bootstrap-vtqvj" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.192975 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-qlzcz" event={"ID":"4790dde0-2293-42c3-b068-19ac0c89968f","Type":"ContainerDied","Data":"375930ff9392b6965f4f7851b7c3bd8c1fbdf67d67bb00b397ba9e60f3e9b7ee"} Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.193019 4774 scope.go:117] "RemoveContainer" containerID="906e100b23bc124b02c423a32c15de8d18e6674c08ab161e8c49adaa9464df7a" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.193139 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-qlzcz" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.204298 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f303caa2-7916-4bdc-ba01-b5c55166dc53-config-data\") pod \"keystone-bootstrap-vtqvj\" (UID: \"f303caa2-7916-4bdc-ba01-b5c55166dc53\") " pod="openstack/keystone-bootstrap-vtqvj" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.204404 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.214264 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glr9f\" (UniqueName: \"kubernetes.io/projected/f303caa2-7916-4bdc-ba01-b5c55166dc53-kube-api-access-glr9f\") pod \"keystone-bootstrap-vtqvj\" (UID: \"f303caa2-7916-4bdc-ba01-b5c55166dc53\") " pod="openstack/keystone-bootstrap-vtqvj" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.222468 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bts6v\" (UniqueName: 
\"kubernetes.io/projected/b635609a-ab1c-4691-be86-da83abc3e663-kube-api-access-bts6v\") pod \"horizon-587669876f-dmzh9\" (UID: \"b635609a-ab1c-4691-be86-da83abc3e663\") " pod="openstack/horizon-587669876f-dmzh9" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.242035 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-22k8m"] Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.242156 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.243667 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.243970 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.247145 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vtqvj" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.253915 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.289795 4774 scope.go:117] "RemoveContainer" containerID="f417c59c012e5a0de4b6a1e81fbc738bfcbc98a00c4988c6b154619536ee031a" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.294656 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-f2ct5"] Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.295818 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-f2ct5" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.308834 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.309032 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-tncn6" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.309135 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.311626 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-f2ct5"] Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.341264 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-8gc5c"] Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.349608 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-8gc5c" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.352630 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.357363 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.360419 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.360779 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.361213 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-q7vtm" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.369442 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-587669876f-dmzh9" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.371015 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-8gc5c"] Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.379494 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmndq\" (UniqueName: \"kubernetes.io/projected/d9e41eee-4655-4cc2-b01d-37d1f947011b-kube-api-access-fmndq\") pod \"ceilometer-0\" (UID: \"d9e41eee-4655-4cc2-b01d-37d1f947011b\") " pod="openstack/ceilometer-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.379534 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9e41eee-4655-4cc2-b01d-37d1f947011b-run-httpd\") pod \"ceilometer-0\" (UID: \"d9e41eee-4655-4cc2-b01d-37d1f947011b\") " pod="openstack/ceilometer-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.379551 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9e41eee-4655-4cc2-b01d-37d1f947011b-scripts\") pod \"ceilometer-0\" (UID: 
\"d9e41eee-4655-4cc2-b01d-37d1f947011b\") " pod="openstack/ceilometer-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.379627 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9e41eee-4655-4cc2-b01d-37d1f947011b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d9e41eee-4655-4cc2-b01d-37d1f947011b\") " pod="openstack/ceilometer-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.379677 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9e41eee-4655-4cc2-b01d-37d1f947011b-config-data\") pod \"ceilometer-0\" (UID: \"d9e41eee-4655-4cc2-b01d-37d1f947011b\") " pod="openstack/ceilometer-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.379695 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9e41eee-4655-4cc2-b01d-37d1f947011b-log-httpd\") pod \"ceilometer-0\" (UID: \"d9e41eee-4655-4cc2-b01d-37d1f947011b\") " pod="openstack/ceilometer-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.379726 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9e41eee-4655-4cc2-b01d-37d1f947011b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d9e41eee-4655-4cc2-b01d-37d1f947011b\") " pod="openstack/ceilometer-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.401245 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-qlzcz"] Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.411901 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.420482 4774 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-qlzcz"] Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.435991 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7c54b8bfd5-ftr47"] Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.437246 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c54b8bfd5-ftr47"] Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.437802 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c54b8bfd5-ftr47" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.447558 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.451947 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.468837 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.477704 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.481411 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce469a02-5678-42c1-84d7-21a29c1b3d18-scripts\") pod \"placement-db-sync-f2ct5\" (UID: \"ce469a02-5678-42c1-84d7-21a29c1b3d18\") " pod="openstack/placement-db-sync-f2ct5" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.481471 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce469a02-5678-42c1-84d7-21a29c1b3d18-combined-ca-bundle\") pod \"placement-db-sync-f2ct5\" (UID: \"ce469a02-5678-42c1-84d7-21a29c1b3d18\") " 
pod="openstack/placement-db-sync-f2ct5" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.481509 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9e41eee-4655-4cc2-b01d-37d1f947011b-config-data\") pod \"ceilometer-0\" (UID: \"d9e41eee-4655-4cc2-b01d-37d1f947011b\") " pod="openstack/ceilometer-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.481536 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9e41eee-4655-4cc2-b01d-37d1f947011b-log-httpd\") pod \"ceilometer-0\" (UID: \"d9e41eee-4655-4cc2-b01d-37d1f947011b\") " pod="openstack/ceilometer-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.481555 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3308cda8-c038-4fbc-91ad-824ce2c1d85c-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-8gc5c\" (UID: \"3308cda8-c038-4fbc-91ad-824ce2c1d85c\") " pod="openstack/dnsmasq-dns-8b5c85b87-8gc5c" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.481574 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9e41eee-4655-4cc2-b01d-37d1f947011b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d9e41eee-4655-4cc2-b01d-37d1f947011b\") " pod="openstack/ceilometer-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.481601 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"86fdf9e4-cf58-46e2-b541-2c03fda113c5\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.481631 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce469a02-5678-42c1-84d7-21a29c1b3d18-logs\") pod \"placement-db-sync-f2ct5\" (UID: \"ce469a02-5678-42c1-84d7-21a29c1b3d18\") " pod="openstack/placement-db-sync-f2ct5" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.481645 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/86fdf9e4-cf58-46e2-b541-2c03fda113c5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"86fdf9e4-cf58-46e2-b541-2c03fda113c5\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.481668 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86fdf9e4-cf58-46e2-b541-2c03fda113c5-config-data\") pod \"glance-default-external-api-0\" (UID: \"86fdf9e4-cf58-46e2-b541-2c03fda113c5\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.481685 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3308cda8-c038-4fbc-91ad-824ce2c1d85c-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-8gc5c\" (UID: \"3308cda8-c038-4fbc-91ad-824ce2c1d85c\") " pod="openstack/dnsmasq-dns-8b5c85b87-8gc5c" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.481721 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmndq\" (UniqueName: \"kubernetes.io/projected/d9e41eee-4655-4cc2-b01d-37d1f947011b-kube-api-access-fmndq\") pod \"ceilometer-0\" (UID: \"d9e41eee-4655-4cc2-b01d-37d1f947011b\") " pod="openstack/ceilometer-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.481739 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cbgs\" (UniqueName: \"kubernetes.io/projected/3308cda8-c038-4fbc-91ad-824ce2c1d85c-kube-api-access-2cbgs\") pod \"dnsmasq-dns-8b5c85b87-8gc5c\" (UID: \"3308cda8-c038-4fbc-91ad-824ce2c1d85c\") " pod="openstack/dnsmasq-dns-8b5c85b87-8gc5c" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.481756 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3308cda8-c038-4fbc-91ad-824ce2c1d85c-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-8gc5c\" (UID: \"3308cda8-c038-4fbc-91ad-824ce2c1d85c\") " pod="openstack/dnsmasq-dns-8b5c85b87-8gc5c" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.481769 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9e41eee-4655-4cc2-b01d-37d1f947011b-run-httpd\") pod \"ceilometer-0\" (UID: \"d9e41eee-4655-4cc2-b01d-37d1f947011b\") " pod="openstack/ceilometer-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.481783 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b569b\" (UniqueName: \"kubernetes.io/projected/ce469a02-5678-42c1-84d7-21a29c1b3d18-kube-api-access-b569b\") pod \"placement-db-sync-f2ct5\" (UID: \"ce469a02-5678-42c1-84d7-21a29c1b3d18\") " pod="openstack/placement-db-sync-f2ct5" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.481801 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9e41eee-4655-4cc2-b01d-37d1f947011b-scripts\") pod \"ceilometer-0\" (UID: \"d9e41eee-4655-4cc2-b01d-37d1f947011b\") " pod="openstack/ceilometer-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.481814 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86fdf9e4-cf58-46e2-b541-2c03fda113c5-logs\") pod \"glance-default-external-api-0\" (UID: \"86fdf9e4-cf58-46e2-b541-2c03fda113c5\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.481832 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr278\" (UniqueName: \"kubernetes.io/projected/86fdf9e4-cf58-46e2-b541-2c03fda113c5-kube-api-access-zr278\") pod \"glance-default-external-api-0\" (UID: \"86fdf9e4-cf58-46e2-b541-2c03fda113c5\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.481867 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3308cda8-c038-4fbc-91ad-824ce2c1d85c-config\") pod \"dnsmasq-dns-8b5c85b87-8gc5c\" (UID: \"3308cda8-c038-4fbc-91ad-824ce2c1d85c\") " pod="openstack/dnsmasq-dns-8b5c85b87-8gc5c" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.481890 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86fdf9e4-cf58-46e2-b541-2c03fda113c5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"86fdf9e4-cf58-46e2-b541-2c03fda113c5\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.481929 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3308cda8-c038-4fbc-91ad-824ce2c1d85c-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-8gc5c\" (UID: \"3308cda8-c038-4fbc-91ad-824ce2c1d85c\") " pod="openstack/dnsmasq-dns-8b5c85b87-8gc5c" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.481945 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce469a02-5678-42c1-84d7-21a29c1b3d18-config-data\") pod \"placement-db-sync-f2ct5\" (UID: \"ce469a02-5678-42c1-84d7-21a29c1b3d18\") " pod="openstack/placement-db-sync-f2ct5" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.481965 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86fdf9e4-cf58-46e2-b541-2c03fda113c5-scripts\") pod \"glance-default-external-api-0\" (UID: \"86fdf9e4-cf58-46e2-b541-2c03fda113c5\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.481983 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9e41eee-4655-4cc2-b01d-37d1f947011b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d9e41eee-4655-4cc2-b01d-37d1f947011b\") " pod="openstack/ceilometer-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.486024 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9e41eee-4655-4cc2-b01d-37d1f947011b-log-httpd\") pod \"ceilometer-0\" (UID: \"d9e41eee-4655-4cc2-b01d-37d1f947011b\") " pod="openstack/ceilometer-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.486150 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9e41eee-4655-4cc2-b01d-37d1f947011b-run-httpd\") pod \"ceilometer-0\" (UID: \"d9e41eee-4655-4cc2-b01d-37d1f947011b\") " pod="openstack/ceilometer-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.510771 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9e41eee-4655-4cc2-b01d-37d1f947011b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"d9e41eee-4655-4cc2-b01d-37d1f947011b\") " pod="openstack/ceilometer-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.512030 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9e41eee-4655-4cc2-b01d-37d1f947011b-scripts\") pod \"ceilometer-0\" (UID: \"d9e41eee-4655-4cc2-b01d-37d1f947011b\") " pod="openstack/ceilometer-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.513845 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9e41eee-4655-4cc2-b01d-37d1f947011b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d9e41eee-4655-4cc2-b01d-37d1f947011b\") " pod="openstack/ceilometer-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.527231 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9e41eee-4655-4cc2-b01d-37d1f947011b-config-data\") pod \"ceilometer-0\" (UID: \"d9e41eee-4655-4cc2-b01d-37d1f947011b\") " pod="openstack/ceilometer-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.542986 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmndq\" (UniqueName: \"kubernetes.io/projected/d9e41eee-4655-4cc2-b01d-37d1f947011b-kube-api-access-fmndq\") pod \"ceilometer-0\" (UID: \"d9e41eee-4655-4cc2-b01d-37d1f947011b\") " pod="openstack/ceilometer-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.585254 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc47f9df-51ec-4aad-861b-c04b1321c5a3-scripts\") pod \"horizon-7c54b8bfd5-ftr47\" (UID: \"fc47f9df-51ec-4aad-861b-c04b1321c5a3\") " pod="openstack/horizon-7c54b8bfd5-ftr47" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.585313 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/ce469a02-5678-42c1-84d7-21a29c1b3d18-scripts\") pod \"placement-db-sync-f2ct5\" (UID: \"ce469a02-5678-42c1-84d7-21a29c1b3d18\") " pod="openstack/placement-db-sync-f2ct5" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.585340 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce469a02-5678-42c1-84d7-21a29c1b3d18-combined-ca-bundle\") pod \"placement-db-sync-f2ct5\" (UID: \"ce469a02-5678-42c1-84d7-21a29c1b3d18\") " pod="openstack/placement-db-sync-f2ct5" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.585361 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a3cfed0-4f99-417c-830b-54217f4bad49-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8a3cfed0-4f99-417c-830b-54217f4bad49\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.585403 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3308cda8-c038-4fbc-91ad-824ce2c1d85c-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-8gc5c\" (UID: \"3308cda8-c038-4fbc-91ad-824ce2c1d85c\") " pod="openstack/dnsmasq-dns-8b5c85b87-8gc5c" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.585420 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc47f9df-51ec-4aad-861b-c04b1321c5a3-logs\") pod \"horizon-7c54b8bfd5-ftr47\" (UID: \"fc47f9df-51ec-4aad-861b-c04b1321c5a3\") " pod="openstack/horizon-7c54b8bfd5-ftr47" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.585440 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"8a3cfed0-4f99-417c-830b-54217f4bad49\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.585467 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"86fdf9e4-cf58-46e2-b541-2c03fda113c5\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.585488 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce469a02-5678-42c1-84d7-21a29c1b3d18-logs\") pod \"placement-db-sync-f2ct5\" (UID: \"ce469a02-5678-42c1-84d7-21a29c1b3d18\") " pod="openstack/placement-db-sync-f2ct5" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.585506 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/86fdf9e4-cf58-46e2-b541-2c03fda113c5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"86fdf9e4-cf58-46e2-b541-2c03fda113c5\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.585527 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a3cfed0-4f99-417c-830b-54217f4bad49-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8a3cfed0-4f99-417c-830b-54217f4bad49\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.585547 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86fdf9e4-cf58-46e2-b541-2c03fda113c5-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"86fdf9e4-cf58-46e2-b541-2c03fda113c5\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.585564 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3308cda8-c038-4fbc-91ad-824ce2c1d85c-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-8gc5c\" (UID: \"3308cda8-c038-4fbc-91ad-824ce2c1d85c\") " pod="openstack/dnsmasq-dns-8b5c85b87-8gc5c" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.585582 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a3cfed0-4f99-417c-830b-54217f4bad49-logs\") pod \"glance-default-internal-api-0\" (UID: \"8a3cfed0-4f99-417c-830b-54217f4bad49\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.585596 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a3cfed0-4f99-417c-830b-54217f4bad49-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8a3cfed0-4f99-417c-830b-54217f4bad49\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.585616 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc47f9df-51ec-4aad-861b-c04b1321c5a3-config-data\") pod \"horizon-7c54b8bfd5-ftr47\" (UID: \"fc47f9df-51ec-4aad-861b-c04b1321c5a3\") " pod="openstack/horizon-7c54b8bfd5-ftr47" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.585636 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cbgs\" (UniqueName: \"kubernetes.io/projected/3308cda8-c038-4fbc-91ad-824ce2c1d85c-kube-api-access-2cbgs\") pod \"dnsmasq-dns-8b5c85b87-8gc5c\" (UID: 
\"3308cda8-c038-4fbc-91ad-824ce2c1d85c\") " pod="openstack/dnsmasq-dns-8b5c85b87-8gc5c" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.585652 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b569b\" (UniqueName: \"kubernetes.io/projected/ce469a02-5678-42c1-84d7-21a29c1b3d18-kube-api-access-b569b\") pod \"placement-db-sync-f2ct5\" (UID: \"ce469a02-5678-42c1-84d7-21a29c1b3d18\") " pod="openstack/placement-db-sync-f2ct5" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.585666 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3308cda8-c038-4fbc-91ad-824ce2c1d85c-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-8gc5c\" (UID: \"3308cda8-c038-4fbc-91ad-824ce2c1d85c\") " pod="openstack/dnsmasq-dns-8b5c85b87-8gc5c" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.585681 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86fdf9e4-cf58-46e2-b541-2c03fda113c5-logs\") pod \"glance-default-external-api-0\" (UID: \"86fdf9e4-cf58-46e2-b541-2c03fda113c5\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.585700 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr278\" (UniqueName: \"kubernetes.io/projected/86fdf9e4-cf58-46e2-b541-2c03fda113c5-kube-api-access-zr278\") pod \"glance-default-external-api-0\" (UID: \"86fdf9e4-cf58-46e2-b541-2c03fda113c5\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.585731 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpcdb\" (UniqueName: \"kubernetes.io/projected/8a3cfed0-4f99-417c-830b-54217f4bad49-kube-api-access-xpcdb\") pod \"glance-default-internal-api-0\" (UID: 
\"8a3cfed0-4f99-417c-830b-54217f4bad49\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.585751 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a3cfed0-4f99-417c-830b-54217f4bad49-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8a3cfed0-4f99-417c-830b-54217f4bad49\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.585779 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3308cda8-c038-4fbc-91ad-824ce2c1d85c-config\") pod \"dnsmasq-dns-8b5c85b87-8gc5c\" (UID: \"3308cda8-c038-4fbc-91ad-824ce2c1d85c\") " pod="openstack/dnsmasq-dns-8b5c85b87-8gc5c" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.585800 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86fdf9e4-cf58-46e2-b541-2c03fda113c5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"86fdf9e4-cf58-46e2-b541-2c03fda113c5\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.585818 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fc47f9df-51ec-4aad-861b-c04b1321c5a3-horizon-secret-key\") pod \"horizon-7c54b8bfd5-ftr47\" (UID: \"fc47f9df-51ec-4aad-861b-c04b1321c5a3\") " pod="openstack/horizon-7c54b8bfd5-ftr47" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.585838 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5j2h\" (UniqueName: \"kubernetes.io/projected/fc47f9df-51ec-4aad-861b-c04b1321c5a3-kube-api-access-j5j2h\") pod \"horizon-7c54b8bfd5-ftr47\" (UID: 
\"fc47f9df-51ec-4aad-861b-c04b1321c5a3\") " pod="openstack/horizon-7c54b8bfd5-ftr47" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.585861 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3308cda8-c038-4fbc-91ad-824ce2c1d85c-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-8gc5c\" (UID: \"3308cda8-c038-4fbc-91ad-824ce2c1d85c\") " pod="openstack/dnsmasq-dns-8b5c85b87-8gc5c" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.585877 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce469a02-5678-42c1-84d7-21a29c1b3d18-config-data\") pod \"placement-db-sync-f2ct5\" (UID: \"ce469a02-5678-42c1-84d7-21a29c1b3d18\") " pod="openstack/placement-db-sync-f2ct5" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.585898 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86fdf9e4-cf58-46e2-b541-2c03fda113c5-scripts\") pod \"glance-default-external-api-0\" (UID: \"86fdf9e4-cf58-46e2-b541-2c03fda113c5\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.595012 4774 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"86fdf9e4-cf58-46e2-b541-2c03fda113c5\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.602099 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce469a02-5678-42c1-84d7-21a29c1b3d18-scripts\") pod \"placement-db-sync-f2ct5\" (UID: \"ce469a02-5678-42c1-84d7-21a29c1b3d18\") " pod="openstack/placement-db-sync-f2ct5" Oct 03 15:00:30 crc 
kubenswrapper[4774]: I1003 15:00:30.602436 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce469a02-5678-42c1-84d7-21a29c1b3d18-logs\") pod \"placement-db-sync-f2ct5\" (UID: \"ce469a02-5678-42c1-84d7-21a29c1b3d18\") " pod="openstack/placement-db-sync-f2ct5" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.602717 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/86fdf9e4-cf58-46e2-b541-2c03fda113c5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"86fdf9e4-cf58-46e2-b541-2c03fda113c5\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.605484 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3308cda8-c038-4fbc-91ad-824ce2c1d85c-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-8gc5c\" (UID: \"3308cda8-c038-4fbc-91ad-824ce2c1d85c\") " pod="openstack/dnsmasq-dns-8b5c85b87-8gc5c" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.607169 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3308cda8-c038-4fbc-91ad-824ce2c1d85c-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-8gc5c\" (UID: \"3308cda8-c038-4fbc-91ad-824ce2c1d85c\") " pod="openstack/dnsmasq-dns-8b5c85b87-8gc5c" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.608015 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86fdf9e4-cf58-46e2-b541-2c03fda113c5-logs\") pod \"glance-default-external-api-0\" (UID: \"86fdf9e4-cf58-46e2-b541-2c03fda113c5\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.612167 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/86fdf9e4-cf58-46e2-b541-2c03fda113c5-scripts\") pod \"glance-default-external-api-0\" (UID: \"86fdf9e4-cf58-46e2-b541-2c03fda113c5\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.613535 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3308cda8-c038-4fbc-91ad-824ce2c1d85c-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-8gc5c\" (UID: \"3308cda8-c038-4fbc-91ad-824ce2c1d85c\") " pod="openstack/dnsmasq-dns-8b5c85b87-8gc5c" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.614886 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.657519 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3308cda8-c038-4fbc-91ad-824ce2c1d85c-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-8gc5c\" (UID: \"3308cda8-c038-4fbc-91ad-824ce2c1d85c\") " pod="openstack/dnsmasq-dns-8b5c85b87-8gc5c" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.658197 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3308cda8-c038-4fbc-91ad-824ce2c1d85c-config\") pod \"dnsmasq-dns-8b5c85b87-8gc5c\" (UID: \"3308cda8-c038-4fbc-91ad-824ce2c1d85c\") " pod="openstack/dnsmasq-dns-8b5c85b87-8gc5c" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.659626 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce469a02-5678-42c1-84d7-21a29c1b3d18-config-data\") pod \"placement-db-sync-f2ct5\" (UID: \"ce469a02-5678-42c1-84d7-21a29c1b3d18\") " pod="openstack/placement-db-sync-f2ct5" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.666013 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce469a02-5678-42c1-84d7-21a29c1b3d18-combined-ca-bundle\") pod \"placement-db-sync-f2ct5\" (UID: \"ce469a02-5678-42c1-84d7-21a29c1b3d18\") " pod="openstack/placement-db-sync-f2ct5" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.669616 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86fdf9e4-cf58-46e2-b541-2c03fda113c5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"86fdf9e4-cf58-46e2-b541-2c03fda113c5\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.683479 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cbgs\" (UniqueName: \"kubernetes.io/projected/3308cda8-c038-4fbc-91ad-824ce2c1d85c-kube-api-access-2cbgs\") pod \"dnsmasq-dns-8b5c85b87-8gc5c\" (UID: \"3308cda8-c038-4fbc-91ad-824ce2c1d85c\") " pod="openstack/dnsmasq-dns-8b5c85b87-8gc5c" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.687238 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr278\" (UniqueName: \"kubernetes.io/projected/86fdf9e4-cf58-46e2-b541-2c03fda113c5-kube-api-access-zr278\") pod \"glance-default-external-api-0\" (UID: \"86fdf9e4-cf58-46e2-b541-2c03fda113c5\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.688422 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc47f9df-51ec-4aad-861b-c04b1321c5a3-scripts\") pod \"horizon-7c54b8bfd5-ftr47\" (UID: \"fc47f9df-51ec-4aad-861b-c04b1321c5a3\") " pod="openstack/horizon-7c54b8bfd5-ftr47" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.688490 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8a3cfed0-4f99-417c-830b-54217f4bad49-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8a3cfed0-4f99-417c-830b-54217f4bad49\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.688526 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc47f9df-51ec-4aad-861b-c04b1321c5a3-logs\") pod \"horizon-7c54b8bfd5-ftr47\" (UID: \"fc47f9df-51ec-4aad-861b-c04b1321c5a3\") " pod="openstack/horizon-7c54b8bfd5-ftr47" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.688550 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"8a3cfed0-4f99-417c-830b-54217f4bad49\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.688596 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a3cfed0-4f99-417c-830b-54217f4bad49-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8a3cfed0-4f99-417c-830b-54217f4bad49\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.688628 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a3cfed0-4f99-417c-830b-54217f4bad49-logs\") pod \"glance-default-internal-api-0\" (UID: \"8a3cfed0-4f99-417c-830b-54217f4bad49\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.688647 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a3cfed0-4f99-417c-830b-54217f4bad49-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"8a3cfed0-4f99-417c-830b-54217f4bad49\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.688671 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc47f9df-51ec-4aad-861b-c04b1321c5a3-config-data\") pod \"horizon-7c54b8bfd5-ftr47\" (UID: \"fc47f9df-51ec-4aad-861b-c04b1321c5a3\") " pod="openstack/horizon-7c54b8bfd5-ftr47" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.688716 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpcdb\" (UniqueName: \"kubernetes.io/projected/8a3cfed0-4f99-417c-830b-54217f4bad49-kube-api-access-xpcdb\") pod \"glance-default-internal-api-0\" (UID: \"8a3cfed0-4f99-417c-830b-54217f4bad49\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.688735 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a3cfed0-4f99-417c-830b-54217f4bad49-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8a3cfed0-4f99-417c-830b-54217f4bad49\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.688779 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fc47f9df-51ec-4aad-861b-c04b1321c5a3-horizon-secret-key\") pod \"horizon-7c54b8bfd5-ftr47\" (UID: \"fc47f9df-51ec-4aad-861b-c04b1321c5a3\") " pod="openstack/horizon-7c54b8bfd5-ftr47" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.688807 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5j2h\" (UniqueName: \"kubernetes.io/projected/fc47f9df-51ec-4aad-861b-c04b1321c5a3-kube-api-access-j5j2h\") pod \"horizon-7c54b8bfd5-ftr47\" (UID: \"fc47f9df-51ec-4aad-861b-c04b1321c5a3\") " 
pod="openstack/horizon-7c54b8bfd5-ftr47" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.688808 4774 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"8a3cfed0-4f99-417c-830b-54217f4bad49\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.689826 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc47f9df-51ec-4aad-861b-c04b1321c5a3-config-data\") pod \"horizon-7c54b8bfd5-ftr47\" (UID: \"fc47f9df-51ec-4aad-861b-c04b1321c5a3\") " pod="openstack/horizon-7c54b8bfd5-ftr47" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.708940 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a3cfed0-4f99-417c-830b-54217f4bad49-logs\") pod \"glance-default-internal-api-0\" (UID: \"8a3cfed0-4f99-417c-830b-54217f4bad49\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.710031 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc47f9df-51ec-4aad-861b-c04b1321c5a3-scripts\") pod \"horizon-7c54b8bfd5-ftr47\" (UID: \"fc47f9df-51ec-4aad-861b-c04b1321c5a3\") " pod="openstack/horizon-7c54b8bfd5-ftr47" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.710761 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a3cfed0-4f99-417c-830b-54217f4bad49-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8a3cfed0-4f99-417c-830b-54217f4bad49\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.711944 4774 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc47f9df-51ec-4aad-861b-c04b1321c5a3-logs\") pod \"horizon-7c54b8bfd5-ftr47\" (UID: \"fc47f9df-51ec-4aad-861b-c04b1321c5a3\") " pod="openstack/horizon-7c54b8bfd5-ftr47" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.737739 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-8gc5c" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.738566 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86fdf9e4-cf58-46e2-b541-2c03fda113c5-config-data\") pod \"glance-default-external-api-0\" (UID: \"86fdf9e4-cf58-46e2-b541-2c03fda113c5\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.749086 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a3cfed0-4f99-417c-830b-54217f4bad49-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8a3cfed0-4f99-417c-830b-54217f4bad49\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.753472 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b569b\" (UniqueName: \"kubernetes.io/projected/ce469a02-5678-42c1-84d7-21a29c1b3d18-kube-api-access-b569b\") pod \"placement-db-sync-f2ct5\" (UID: \"ce469a02-5678-42c1-84d7-21a29c1b3d18\") " pod="openstack/placement-db-sync-f2ct5" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.754052 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fc47f9df-51ec-4aad-861b-c04b1321c5a3-horizon-secret-key\") pod \"horizon-7c54b8bfd5-ftr47\" (UID: \"fc47f9df-51ec-4aad-861b-c04b1321c5a3\") " pod="openstack/horizon-7c54b8bfd5-ftr47" Oct 03 15:00:30 crc 
kubenswrapper[4774]: I1003 15:00:30.757410 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a3cfed0-4f99-417c-830b-54217f4bad49-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8a3cfed0-4f99-417c-830b-54217f4bad49\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.764500 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a3cfed0-4f99-417c-830b-54217f4bad49-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8a3cfed0-4f99-417c-830b-54217f4bad49\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.769042 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpcdb\" (UniqueName: \"kubernetes.io/projected/8a3cfed0-4f99-417c-830b-54217f4bad49-kube-api-access-xpcdb\") pod \"glance-default-internal-api-0\" (UID: \"8a3cfed0-4f99-417c-830b-54217f4bad49\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.817193 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5j2h\" (UniqueName: \"kubernetes.io/projected/fc47f9df-51ec-4aad-861b-c04b1321c5a3-kube-api-access-j5j2h\") pod \"horizon-7c54b8bfd5-ftr47\" (UID: \"fc47f9df-51ec-4aad-861b-c04b1321c5a3\") " pod="openstack/horizon-7c54b8bfd5-ftr47" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.833616 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"8a3cfed0-4f99-417c-830b-54217f4bad49\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.837893 4774 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"86fdf9e4-cf58-46e2-b541-2c03fda113c5\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:30 crc kubenswrapper[4774]: E1003 15:00:30.939139 4774 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Oct 03 15:00:30 crc kubenswrapper[4774]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/17498ea5-9a08-42d6-bf48-e2eeca875964/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 03 15:00:30 crc kubenswrapper[4774]: > podSandboxID="abe6e1165c487cd013a66deae7742c4422924d9c0917eecf12c589d1f0a232dc" Oct 03 15:00:30 crc kubenswrapper[4774]: E1003 15:00:30.939547 4774 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 03 15:00:30 crc kubenswrapper[4774]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n64h68fh95h595h67fh597hfch57ch68fh5ffh6hf4h689h659h569h65bh67bh65dh594h64h5d6hd8h5bfh9fh5c4h676h5cdh56h8bh569h664h645q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xznrr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7ff5475cc9-7kxbv_openstack(17498ea5-9a08-42d6-bf48-e2eeca875964): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/17498ea5-9a08-42d6-bf48-e2eeca875964/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 03 15:00:30 crc kubenswrapper[4774]: > logger="UnhandledError" Oct 03 15:00:30 crc kubenswrapper[4774]: E1003 15:00:30.942092 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/17498ea5-9a08-42d6-bf48-e2eeca875964/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-7ff5475cc9-7kxbv" podUID="17498ea5-9a08-42d6-bf48-e2eeca875964" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.951686 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-f2ct5" Oct 03 15:00:30 crc kubenswrapper[4774]: I1003 15:00:30.955044 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-22k8m"] Oct 03 15:00:30 crc kubenswrapper[4774]: W1003 15:00:30.981095 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6552c40b_eb16_484f_bcbb_064f92daeab8.slice/crio-515a7c928d10cfc8a5ce120cc81dc53d0b8c03e7ecbd26675f0198f2ce272232 WatchSource:0}: Error finding container 515a7c928d10cfc8a5ce120cc81dc53d0b8c03e7ecbd26675f0198f2ce272232: Status 404 returned error can't find the container with id 515a7c928d10cfc8a5ce120cc81dc53d0b8c03e7ecbd26675f0198f2ce272232 Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.014307 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.015358 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c54b8bfd5-ftr47" Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.016286 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.079851 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-sx6cl"] Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.089944 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-sx6cl"] Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.090052 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-sx6cl" Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.093590 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-6swnb" Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.094278 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.095023 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.203182 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/be96775d-6115-4c8e-8539-230de2424b0e-etc-machine-id\") pod \"cinder-db-sync-sx6cl\" (UID: \"be96775d-6115-4c8e-8539-230de2424b0e\") " pod="openstack/cinder-db-sync-sx6cl" Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.203548 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-972jg\" (UniqueName: \"kubernetes.io/projected/be96775d-6115-4c8e-8539-230de2424b0e-kube-api-access-972jg\") pod \"cinder-db-sync-sx6cl\" (UID: \"be96775d-6115-4c8e-8539-230de2424b0e\") " pod="openstack/cinder-db-sync-sx6cl" Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.203589 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be96775d-6115-4c8e-8539-230de2424b0e-scripts\") pod \"cinder-db-sync-sx6cl\" (UID: \"be96775d-6115-4c8e-8539-230de2424b0e\") " pod="openstack/cinder-db-sync-sx6cl" Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.203619 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/be96775d-6115-4c8e-8539-230de2424b0e-db-sync-config-data\") pod \"cinder-db-sync-sx6cl\" (UID: \"be96775d-6115-4c8e-8539-230de2424b0e\") " pod="openstack/cinder-db-sync-sx6cl" Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.203689 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be96775d-6115-4c8e-8539-230de2424b0e-combined-ca-bundle\") pod \"cinder-db-sync-sx6cl\" (UID: \"be96775d-6115-4c8e-8539-230de2424b0e\") " pod="openstack/cinder-db-sync-sx6cl" Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.203738 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be96775d-6115-4c8e-8539-230de2424b0e-config-data\") pod \"cinder-db-sync-sx6cl\" (UID: \"be96775d-6115-4c8e-8539-230de2424b0e\") " pod="openstack/cinder-db-sync-sx6cl" Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.286771 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-22k8m" event={"ID":"6552c40b-eb16-484f-bcbb-064f92daeab8","Type":"ContainerStarted","Data":"515a7c928d10cfc8a5ce120cc81dc53d0b8c03e7ecbd26675f0198f2ce272232"} Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.292838 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vtqvj"] Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.306105 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/be96775d-6115-4c8e-8539-230de2424b0e-db-sync-config-data\") pod \"cinder-db-sync-sx6cl\" (UID: \"be96775d-6115-4c8e-8539-230de2424b0e\") " pod="openstack/cinder-db-sync-sx6cl" Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.306685 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be96775d-6115-4c8e-8539-230de2424b0e-combined-ca-bundle\") pod \"cinder-db-sync-sx6cl\" (UID: \"be96775d-6115-4c8e-8539-230de2424b0e\") " pod="openstack/cinder-db-sync-sx6cl" Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.306905 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be96775d-6115-4c8e-8539-230de2424b0e-config-data\") pod \"cinder-db-sync-sx6cl\" (UID: \"be96775d-6115-4c8e-8539-230de2424b0e\") " pod="openstack/cinder-db-sync-sx6cl" Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.307031 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/be96775d-6115-4c8e-8539-230de2424b0e-etc-machine-id\") pod \"cinder-db-sync-sx6cl\" (UID: \"be96775d-6115-4c8e-8539-230de2424b0e\") " pod="openstack/cinder-db-sync-sx6cl" Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.307168 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-972jg\" (UniqueName: \"kubernetes.io/projected/be96775d-6115-4c8e-8539-230de2424b0e-kube-api-access-972jg\") pod \"cinder-db-sync-sx6cl\" (UID: \"be96775d-6115-4c8e-8539-230de2424b0e\") " pod="openstack/cinder-db-sync-sx6cl" Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.307309 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be96775d-6115-4c8e-8539-230de2424b0e-scripts\") pod \"cinder-db-sync-sx6cl\" (UID: \"be96775d-6115-4c8e-8539-230de2424b0e\") " pod="openstack/cinder-db-sync-sx6cl" Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.307890 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/be96775d-6115-4c8e-8539-230de2424b0e-etc-machine-id\") pod \"cinder-db-sync-sx6cl\" (UID: 
\"be96775d-6115-4c8e-8539-230de2424b0e\") " pod="openstack/cinder-db-sync-sx6cl" Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.317455 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be96775d-6115-4c8e-8539-230de2424b0e-scripts\") pod \"cinder-db-sync-sx6cl\" (UID: \"be96775d-6115-4c8e-8539-230de2424b0e\") " pod="openstack/cinder-db-sync-sx6cl" Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.319616 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/be96775d-6115-4c8e-8539-230de2424b0e-db-sync-config-data\") pod \"cinder-db-sync-sx6cl\" (UID: \"be96775d-6115-4c8e-8539-230de2424b0e\") " pod="openstack/cinder-db-sync-sx6cl" Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.319820 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be96775d-6115-4c8e-8539-230de2424b0e-combined-ca-bundle\") pod \"cinder-db-sync-sx6cl\" (UID: \"be96775d-6115-4c8e-8539-230de2424b0e\") " pod="openstack/cinder-db-sync-sx6cl" Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.322323 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be96775d-6115-4c8e-8539-230de2424b0e-config-data\") pod \"cinder-db-sync-sx6cl\" (UID: \"be96775d-6115-4c8e-8539-230de2424b0e\") " pod="openstack/cinder-db-sync-sx6cl" Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.351987 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-972jg\" (UniqueName: \"kubernetes.io/projected/be96775d-6115-4c8e-8539-230de2424b0e-kube-api-access-972jg\") pod \"cinder-db-sync-sx6cl\" (UID: \"be96775d-6115-4c8e-8539-230de2424b0e\") " pod="openstack/cinder-db-sync-sx6cl" Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.360179 4774 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="4790dde0-2293-42c3-b068-19ac0c89968f" path="/var/lib/kubelet/pods/4790dde0-2293-42c3-b068-19ac0c89968f/volumes" Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.360790 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-587669876f-dmzh9"] Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.419874 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-sx6cl" Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.492391 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-8gc5c"] Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.514788 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.614233 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-f2ct5"] Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.825165 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-7kxbv" Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.839321 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.914394 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c54b8bfd5-ftr47"] Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.925923 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17498ea5-9a08-42d6-bf48-e2eeca875964-ovsdbserver-sb\") pod \"17498ea5-9a08-42d6-bf48-e2eeca875964\" (UID: \"17498ea5-9a08-42d6-bf48-e2eeca875964\") " Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.926145 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17498ea5-9a08-42d6-bf48-e2eeca875964-config\") pod \"17498ea5-9a08-42d6-bf48-e2eeca875964\" (UID: \"17498ea5-9a08-42d6-bf48-e2eeca875964\") " Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.926174 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17498ea5-9a08-42d6-bf48-e2eeca875964-dns-svc\") pod \"17498ea5-9a08-42d6-bf48-e2eeca875964\" (UID: \"17498ea5-9a08-42d6-bf48-e2eeca875964\") " Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.926208 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17498ea5-9a08-42d6-bf48-e2eeca875964-dns-swift-storage-0\") pod \"17498ea5-9a08-42d6-bf48-e2eeca875964\" (UID: \"17498ea5-9a08-42d6-bf48-e2eeca875964\") " Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.926267 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/17498ea5-9a08-42d6-bf48-e2eeca875964-ovsdbserver-nb\") pod \"17498ea5-9a08-42d6-bf48-e2eeca875964\" (UID: \"17498ea5-9a08-42d6-bf48-e2eeca875964\") " Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.926351 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xznrr\" (UniqueName: \"kubernetes.io/projected/17498ea5-9a08-42d6-bf48-e2eeca875964-kube-api-access-xznrr\") pod \"17498ea5-9a08-42d6-bf48-e2eeca875964\" (UID: \"17498ea5-9a08-42d6-bf48-e2eeca875964\") " Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.936693 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17498ea5-9a08-42d6-bf48-e2eeca875964-kube-api-access-xznrr" (OuterVolumeSpecName: "kube-api-access-xznrr") pod "17498ea5-9a08-42d6-bf48-e2eeca875964" (UID: "17498ea5-9a08-42d6-bf48-e2eeca875964"). InnerVolumeSpecName "kube-api-access-xznrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:00:31 crc kubenswrapper[4774]: I1003 15:00:31.987762 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17498ea5-9a08-42d6-bf48-e2eeca875964-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "17498ea5-9a08-42d6-bf48-e2eeca875964" (UID: "17498ea5-9a08-42d6-bf48-e2eeca875964"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.011001 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.012429 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17498ea5-9a08-42d6-bf48-e2eeca875964-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "17498ea5-9a08-42d6-bf48-e2eeca875964" (UID: "17498ea5-9a08-42d6-bf48-e2eeca875964"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.026809 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17498ea5-9a08-42d6-bf48-e2eeca875964-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "17498ea5-9a08-42d6-bf48-e2eeca875964" (UID: "17498ea5-9a08-42d6-bf48-e2eeca875964"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.029002 4774 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17498ea5-9a08-42d6-bf48-e2eeca875964-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.029049 4774 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17498ea5-9a08-42d6-bf48-e2eeca875964-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.029065 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xznrr\" (UniqueName: \"kubernetes.io/projected/17498ea5-9a08-42d6-bf48-e2eeca875964-kube-api-access-xznrr\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.029078 4774 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17498ea5-9a08-42d6-bf48-e2eeca875964-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.031333 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17498ea5-9a08-42d6-bf48-e2eeca875964-config" (OuterVolumeSpecName: "config") pod "17498ea5-9a08-42d6-bf48-e2eeca875964" (UID: "17498ea5-9a08-42d6-bf48-e2eeca875964"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.038325 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17498ea5-9a08-42d6-bf48-e2eeca875964-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "17498ea5-9a08-42d6-bf48-e2eeca875964" (UID: "17498ea5-9a08-42d6-bf48-e2eeca875964"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.043516 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-sx6cl"] Oct 03 15:00:32 crc kubenswrapper[4774]: W1003 15:00:32.067565 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe96775d_6115_4c8e_8539_230de2424b0e.slice/crio-8d38e8df35eb5aa4bf7257373ce1154d0af8bdfde0cd1c44302f4b29d61d7f40 WatchSource:0}: Error finding container 8d38e8df35eb5aa4bf7257373ce1154d0af8bdfde0cd1c44302f4b29d61d7f40: Status 404 returned error can't find the container with id 8d38e8df35eb5aa4bf7257373ce1154d0af8bdfde0cd1c44302f4b29d61d7f40 Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.134467 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17498ea5-9a08-42d6-bf48-e2eeca875964-config\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.134499 4774 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17498ea5-9a08-42d6-bf48-e2eeca875964-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.312283 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"86fdf9e4-cf58-46e2-b541-2c03fda113c5","Type":"ContainerStarted","Data":"5fb29dc367a9e6c1b221a426502d3dfe1b9af3d71a3926115bf8b935443fcb35"} Oct 
03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.314006 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9e41eee-4655-4cc2-b01d-37d1f947011b","Type":"ContainerStarted","Data":"74d32d7dea157a878329b28698540bb86ee7abbfda097b29ed0ec979532f70b3"} Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.315673 4774 generic.go:334] "Generic (PLEG): container finished" podID="3308cda8-c038-4fbc-91ad-824ce2c1d85c" containerID="928acf2f57bdfddefacdbe8fb2150cfeeb1911e55dd2a7b5d1b38200ad166808" exitCode=0 Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.315908 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-8gc5c" event={"ID":"3308cda8-c038-4fbc-91ad-824ce2c1d85c","Type":"ContainerDied","Data":"928acf2f57bdfddefacdbe8fb2150cfeeb1911e55dd2a7b5d1b38200ad166808"} Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.315991 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-8gc5c" event={"ID":"3308cda8-c038-4fbc-91ad-824ce2c1d85c","Type":"ContainerStarted","Data":"87eaabb98753bc48d382234dd58cd7845b50f26fbc2a03b5aff1016cbf2a8ada"} Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.317691 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-f2ct5" event={"ID":"ce469a02-5678-42c1-84d7-21a29c1b3d18","Type":"ContainerStarted","Data":"7de213dd285b152b26d64cd6f97bcc2ea847e6c2e4f2a2a72cb7d0aebda99d89"} Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.325237 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8a3cfed0-4f99-417c-830b-54217f4bad49","Type":"ContainerStarted","Data":"9d30bb8a03e141a900f3a5c07ea12f2023a9968f161d75aff7f71fcf56c0e623"} Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.334310 4774 generic.go:334] "Generic (PLEG): container finished" podID="6552c40b-eb16-484f-bcbb-064f92daeab8" 
containerID="4cf4a3a7f245e37ccbe33ab7d379be239de375b63166a7107b9472f539ffeb41" exitCode=0 Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.334391 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-22k8m" event={"ID":"6552c40b-eb16-484f-bcbb-064f92daeab8","Type":"ContainerDied","Data":"4cf4a3a7f245e37ccbe33ab7d379be239de375b63166a7107b9472f539ffeb41"} Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.342344 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-587669876f-dmzh9" event={"ID":"b635609a-ab1c-4691-be86-da83abc3e663","Type":"ContainerStarted","Data":"079f3fb4b4b86f3c8e890f020e1736f355967a408b5692525430b0ad3971aaeb"} Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.356305 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-sx6cl" event={"ID":"be96775d-6115-4c8e-8539-230de2424b0e","Type":"ContainerStarted","Data":"8d38e8df35eb5aa4bf7257373ce1154d0af8bdfde0cd1c44302f4b29d61d7f40"} Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.368843 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-7kxbv" event={"ID":"17498ea5-9a08-42d6-bf48-e2eeca875964","Type":"ContainerDied","Data":"abe6e1165c487cd013a66deae7742c4422924d9c0917eecf12c589d1f0a232dc"} Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.368892 4774 scope.go:117] "RemoveContainer" containerID="18794906c28148ff0626e9311eb8397e2206e5896bdb5e925fdbea8d99d8b334" Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.368911 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-7kxbv" Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.371776 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c54b8bfd5-ftr47" event={"ID":"fc47f9df-51ec-4aad-861b-c04b1321c5a3","Type":"ContainerStarted","Data":"6b0f81373fdb8f9647deae4a695905b55eded21f10c5441e988626234a4c1453"} Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.377535 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vtqvj" event={"ID":"f303caa2-7916-4bdc-ba01-b5c55166dc53","Type":"ContainerStarted","Data":"93db5edd6b278dc571e695d72ddf2d8458116f67d7c550c2f036184cacbf1e4d"} Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.377564 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vtqvj" event={"ID":"f303caa2-7916-4bdc-ba01-b5c55166dc53","Type":"ContainerStarted","Data":"9b0924bb85f439a17915bd2ac19c49c3804b210b914bc4bca6d54da4b65c8934"} Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.404321 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-vtqvj" podStartSLOduration=3.404304187 podStartE2EDuration="3.404304187s" podCreationTimestamp="2025-10-03 15:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:00:32.398175874 +0000 UTC m=+1054.987379426" watchObservedRunningTime="2025-10-03 15:00:32.404304187 +0000 UTC m=+1054.993507639" Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.450203 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-7kxbv"] Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.464947 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-7kxbv"] Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.870913 4774 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-22k8m" Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.970505 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6552c40b-eb16-484f-bcbb-064f92daeab8-ovsdbserver-nb\") pod \"6552c40b-eb16-484f-bcbb-064f92daeab8\" (UID: \"6552c40b-eb16-484f-bcbb-064f92daeab8\") " Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.970576 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6552c40b-eb16-484f-bcbb-064f92daeab8-dns-swift-storage-0\") pod \"6552c40b-eb16-484f-bcbb-064f92daeab8\" (UID: \"6552c40b-eb16-484f-bcbb-064f92daeab8\") " Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.970637 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6552c40b-eb16-484f-bcbb-064f92daeab8-ovsdbserver-sb\") pod \"6552c40b-eb16-484f-bcbb-064f92daeab8\" (UID: \"6552c40b-eb16-484f-bcbb-064f92daeab8\") " Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.970658 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6552c40b-eb16-484f-bcbb-064f92daeab8-dns-svc\") pod \"6552c40b-eb16-484f-bcbb-064f92daeab8\" (UID: \"6552c40b-eb16-484f-bcbb-064f92daeab8\") " Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.970687 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6552c40b-eb16-484f-bcbb-064f92daeab8-config\") pod \"6552c40b-eb16-484f-bcbb-064f92daeab8\" (UID: \"6552c40b-eb16-484f-bcbb-064f92daeab8\") " Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.970729 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-llrh4\" (UniqueName: \"kubernetes.io/projected/6552c40b-eb16-484f-bcbb-064f92daeab8-kube-api-access-llrh4\") pod \"6552c40b-eb16-484f-bcbb-064f92daeab8\" (UID: \"6552c40b-eb16-484f-bcbb-064f92daeab8\") " Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.976437 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6552c40b-eb16-484f-bcbb-064f92daeab8-kube-api-access-llrh4" (OuterVolumeSpecName: "kube-api-access-llrh4") pod "6552c40b-eb16-484f-bcbb-064f92daeab8" (UID: "6552c40b-eb16-484f-bcbb-064f92daeab8"). InnerVolumeSpecName "kube-api-access-llrh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.991581 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6552c40b-eb16-484f-bcbb-064f92daeab8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6552c40b-eb16-484f-bcbb-064f92daeab8" (UID: "6552c40b-eb16-484f-bcbb-064f92daeab8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:00:32 crc kubenswrapper[4774]: I1003 15:00:32.996507 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6552c40b-eb16-484f-bcbb-064f92daeab8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6552c40b-eb16-484f-bcbb-064f92daeab8" (UID: "6552c40b-eb16-484f-bcbb-064f92daeab8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.002945 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6552c40b-eb16-484f-bcbb-064f92daeab8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6552c40b-eb16-484f-bcbb-064f92daeab8" (UID: "6552c40b-eb16-484f-bcbb-064f92daeab8"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.004613 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6552c40b-eb16-484f-bcbb-064f92daeab8-config" (OuterVolumeSpecName: "config") pod "6552c40b-eb16-484f-bcbb-064f92daeab8" (UID: "6552c40b-eb16-484f-bcbb-064f92daeab8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.006212 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6552c40b-eb16-484f-bcbb-064f92daeab8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6552c40b-eb16-484f-bcbb-064f92daeab8" (UID: "6552c40b-eb16-484f-bcbb-064f92daeab8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.079054 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llrh4\" (UniqueName: \"kubernetes.io/projected/6552c40b-eb16-484f-bcbb-064f92daeab8-kube-api-access-llrh4\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.079102 4774 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6552c40b-eb16-484f-bcbb-064f92daeab8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.079114 4774 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6552c40b-eb16-484f-bcbb-064f92daeab8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.079126 4774 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6552c40b-eb16-484f-bcbb-064f92daeab8-ovsdbserver-sb\") on node \"crc\" 
DevicePath \"\"" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.079163 4774 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6552c40b-eb16-484f-bcbb-064f92daeab8-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.079174 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6552c40b-eb16-484f-bcbb-064f92daeab8-config\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.144953 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.188604 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-587669876f-dmzh9"] Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.214874 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.265860 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-765f99df69-hhjpp"] Oct 03 15:00:33 crc kubenswrapper[4774]: E1003 15:00:33.266327 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17498ea5-9a08-42d6-bf48-e2eeca875964" containerName="init" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.266344 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="17498ea5-9a08-42d6-bf48-e2eeca875964" containerName="init" Oct 03 15:00:33 crc kubenswrapper[4774]: E1003 15:00:33.266405 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6552c40b-eb16-484f-bcbb-064f92daeab8" containerName="init" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.266415 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="6552c40b-eb16-484f-bcbb-064f92daeab8" containerName="init" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.291191 4774 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="17498ea5-9a08-42d6-bf48-e2eeca875964" containerName="init" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.291265 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="6552c40b-eb16-484f-bcbb-064f92daeab8" containerName="init" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.292753 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-765f99df69-hhjpp" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.351756 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17498ea5-9a08-42d6-bf48-e2eeca875964" path="/var/lib/kubelet/pods/17498ea5-9a08-42d6-bf48-e2eeca875964/volumes" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.356134 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-765f99df69-hhjpp"] Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.356303 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.397067 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a-logs\") pod \"horizon-765f99df69-hhjpp\" (UID: \"e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a\") " pod="openstack/horizon-765f99df69-hhjpp" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.397141 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a-horizon-secret-key\") pod \"horizon-765f99df69-hhjpp\" (UID: \"e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a\") " pod="openstack/horizon-765f99df69-hhjpp" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.397180 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-4mrsb\" (UniqueName: \"kubernetes.io/projected/e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a-kube-api-access-4mrsb\") pod \"horizon-765f99df69-hhjpp\" (UID: \"e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a\") " pod="openstack/horizon-765f99df69-hhjpp" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.397215 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a-scripts\") pod \"horizon-765f99df69-hhjpp\" (UID: \"e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a\") " pod="openstack/horizon-765f99df69-hhjpp" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.397234 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a-config-data\") pod \"horizon-765f99df69-hhjpp\" (UID: \"e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a\") " pod="openstack/horizon-765f99df69-hhjpp" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.421894 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8a3cfed0-4f99-417c-830b-54217f4bad49","Type":"ContainerStarted","Data":"42120485ed44e93ecb0a6580d47ee5ba1d2a7ce990cea659352a3d49a42540ea"} Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.433174 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"86fdf9e4-cf58-46e2-b541-2c03fda113c5","Type":"ContainerStarted","Data":"6616c8e3faf8b095b06438175cdf1a0ac16119db9528bea6127e766683cad29d"} Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.462014 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-8gc5c" event={"ID":"3308cda8-c038-4fbc-91ad-824ce2c1d85c","Type":"ContainerStarted","Data":"d3b421bd6d3316b3283e7c33022a26900e008c55af8794448be7869d582c2540"} Oct 03 15:00:33 crc 
kubenswrapper[4774]: I1003 15:00:33.462286 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8b5c85b87-8gc5c" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.468359 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-22k8m" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.468600 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-22k8m" event={"ID":"6552c40b-eb16-484f-bcbb-064f92daeab8","Type":"ContainerDied","Data":"515a7c928d10cfc8a5ce120cc81dc53d0b8c03e7ecbd26675f0198f2ce272232"} Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.468643 4774 scope.go:117] "RemoveContainer" containerID="4cf4a3a7f245e37ccbe33ab7d379be239de375b63166a7107b9472f539ffeb41" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.485833 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8b5c85b87-8gc5c" podStartSLOduration=3.48580825 podStartE2EDuration="3.48580825s" podCreationTimestamp="2025-10-03 15:00:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:00:33.480915088 +0000 UTC m=+1056.070118540" watchObservedRunningTime="2025-10-03 15:00:33.48580825 +0000 UTC m=+1056.075011702" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.498571 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a-scripts\") pod \"horizon-765f99df69-hhjpp\" (UID: \"e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a\") " pod="openstack/horizon-765f99df69-hhjpp" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.498624 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a-config-data\") pod \"horizon-765f99df69-hhjpp\" (UID: \"e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a\") " pod="openstack/horizon-765f99df69-hhjpp" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.498732 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a-logs\") pod \"horizon-765f99df69-hhjpp\" (UID: \"e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a\") " pod="openstack/horizon-765f99df69-hhjpp" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.498973 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a-horizon-secret-key\") pod \"horizon-765f99df69-hhjpp\" (UID: \"e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a\") " pod="openstack/horizon-765f99df69-hhjpp" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.499014 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mrsb\" (UniqueName: \"kubernetes.io/projected/e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a-kube-api-access-4mrsb\") pod \"horizon-765f99df69-hhjpp\" (UID: \"e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a\") " pod="openstack/horizon-765f99df69-hhjpp" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.499493 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a-logs\") pod \"horizon-765f99df69-hhjpp\" (UID: \"e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a\") " pod="openstack/horizon-765f99df69-hhjpp" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.499686 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a-scripts\") pod \"horizon-765f99df69-hhjpp\" (UID: 
\"e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a\") " pod="openstack/horizon-765f99df69-hhjpp" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.500482 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a-config-data\") pod \"horizon-765f99df69-hhjpp\" (UID: \"e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a\") " pod="openstack/horizon-765f99df69-hhjpp" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.520974 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a-horizon-secret-key\") pod \"horizon-765f99df69-hhjpp\" (UID: \"e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a\") " pod="openstack/horizon-765f99df69-hhjpp" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.521770 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mrsb\" (UniqueName: \"kubernetes.io/projected/e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a-kube-api-access-4mrsb\") pod \"horizon-765f99df69-hhjpp\" (UID: \"e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a\") " pod="openstack/horizon-765f99df69-hhjpp" Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.538638 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-22k8m"] Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.545882 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-22k8m"] Oct 03 15:00:33 crc kubenswrapper[4774]: I1003 15:00:33.649525 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-765f99df69-hhjpp" Oct 03 15:00:34 crc kubenswrapper[4774]: I1003 15:00:34.180881 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-765f99df69-hhjpp"] Oct 03 15:00:34 crc kubenswrapper[4774]: I1003 15:00:34.478884 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"86fdf9e4-cf58-46e2-b541-2c03fda113c5","Type":"ContainerStarted","Data":"ceb81e14262147ea8dc69fb535a3c45b4247cf8388afc17bf35cb2cca414dba0"} Oct 03 15:00:34 crc kubenswrapper[4774]: I1003 15:00:34.479000 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="86fdf9e4-cf58-46e2-b541-2c03fda113c5" containerName="glance-log" containerID="cri-o://6616c8e3faf8b095b06438175cdf1a0ac16119db9528bea6127e766683cad29d" gracePeriod=30 Oct 03 15:00:34 crc kubenswrapper[4774]: I1003 15:00:34.479092 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="86fdf9e4-cf58-46e2-b541-2c03fda113c5" containerName="glance-httpd" containerID="cri-o://ceb81e14262147ea8dc69fb535a3c45b4247cf8388afc17bf35cb2cca414dba0" gracePeriod=30 Oct 03 15:00:34 crc kubenswrapper[4774]: I1003 15:00:34.486661 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8a3cfed0-4f99-417c-830b-54217f4bad49","Type":"ContainerStarted","Data":"ec1218a393f5b0da2477da87608817b8eda9ddda25d8f55b89f13b0c495136c9"} Oct 03 15:00:34 crc kubenswrapper[4774]: I1003 15:00:34.486888 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8a3cfed0-4f99-417c-830b-54217f4bad49" containerName="glance-log" containerID="cri-o://42120485ed44e93ecb0a6580d47ee5ba1d2a7ce990cea659352a3d49a42540ea" gracePeriod=30 Oct 03 15:00:34 crc kubenswrapper[4774]: I1003 15:00:34.486909 
4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8a3cfed0-4f99-417c-830b-54217f4bad49" containerName="glance-httpd" containerID="cri-o://ec1218a393f5b0da2477da87608817b8eda9ddda25d8f55b89f13b0c495136c9" gracePeriod=30 Oct 03 15:00:34 crc kubenswrapper[4774]: I1003 15:00:34.489508 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-765f99df69-hhjpp" event={"ID":"e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a","Type":"ContainerStarted","Data":"b6edec92a6c97390450306ed5b9579cfdb4999e4c83cd44fef8162553648f1d3"} Oct 03 15:00:34 crc kubenswrapper[4774]: I1003 15:00:34.508123 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.508105658 podStartE2EDuration="4.508105658s" podCreationTimestamp="2025-10-03 15:00:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:00:34.502265762 +0000 UTC m=+1057.091469214" watchObservedRunningTime="2025-10-03 15:00:34.508105658 +0000 UTC m=+1057.097309120" Oct 03 15:00:34 crc kubenswrapper[4774]: I1003 15:00:34.529707 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.529678065 podStartE2EDuration="4.529678065s" podCreationTimestamp="2025-10-03 15:00:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:00:34.525503611 +0000 UTC m=+1057.114707053" watchObservedRunningTime="2025-10-03 15:00:34.529678065 +0000 UTC m=+1057.118881507" Oct 03 15:00:35 crc kubenswrapper[4774]: I1003 15:00:35.313544 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6552c40b-eb16-484f-bcbb-064f92daeab8" 
path="/var/lib/kubelet/pods/6552c40b-eb16-484f-bcbb-064f92daeab8/volumes" Oct 03 15:00:35 crc kubenswrapper[4774]: I1003 15:00:35.545411 4774 generic.go:334] "Generic (PLEG): container finished" podID="8a3cfed0-4f99-417c-830b-54217f4bad49" containerID="ec1218a393f5b0da2477da87608817b8eda9ddda25d8f55b89f13b0c495136c9" exitCode=0 Oct 03 15:00:35 crc kubenswrapper[4774]: I1003 15:00:35.545683 4774 generic.go:334] "Generic (PLEG): container finished" podID="8a3cfed0-4f99-417c-830b-54217f4bad49" containerID="42120485ed44e93ecb0a6580d47ee5ba1d2a7ce990cea659352a3d49a42540ea" exitCode=143 Oct 03 15:00:35 crc kubenswrapper[4774]: I1003 15:00:35.545500 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8a3cfed0-4f99-417c-830b-54217f4bad49","Type":"ContainerDied","Data":"ec1218a393f5b0da2477da87608817b8eda9ddda25d8f55b89f13b0c495136c9"} Oct 03 15:00:35 crc kubenswrapper[4774]: I1003 15:00:35.545751 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8a3cfed0-4f99-417c-830b-54217f4bad49","Type":"ContainerDied","Data":"42120485ed44e93ecb0a6580d47ee5ba1d2a7ce990cea659352a3d49a42540ea"} Oct 03 15:00:35 crc kubenswrapper[4774]: I1003 15:00:35.559735 4774 generic.go:334] "Generic (PLEG): container finished" podID="86fdf9e4-cf58-46e2-b541-2c03fda113c5" containerID="ceb81e14262147ea8dc69fb535a3c45b4247cf8388afc17bf35cb2cca414dba0" exitCode=0 Oct 03 15:00:35 crc kubenswrapper[4774]: I1003 15:00:35.559768 4774 generic.go:334] "Generic (PLEG): container finished" podID="86fdf9e4-cf58-46e2-b541-2c03fda113c5" containerID="6616c8e3faf8b095b06438175cdf1a0ac16119db9528bea6127e766683cad29d" exitCode=143 Oct 03 15:00:35 crc kubenswrapper[4774]: I1003 15:00:35.559791 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"86fdf9e4-cf58-46e2-b541-2c03fda113c5","Type":"ContainerDied","Data":"ceb81e14262147ea8dc69fb535a3c45b4247cf8388afc17bf35cb2cca414dba0"} Oct 03 15:00:35 crc kubenswrapper[4774]: I1003 15:00:35.559820 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"86fdf9e4-cf58-46e2-b541-2c03fda113c5","Type":"ContainerDied","Data":"6616c8e3faf8b095b06438175cdf1a0ac16119db9528bea6127e766683cad29d"} Oct 03 15:00:35 crc kubenswrapper[4774]: I1003 15:00:35.703944 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-2589-account-create-zksfs"] Oct 03 15:00:35 crc kubenswrapper[4774]: I1003 15:00:35.705224 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2589-account-create-zksfs" Oct 03 15:00:35 crc kubenswrapper[4774]: I1003 15:00:35.707884 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 03 15:00:35 crc kubenswrapper[4774]: I1003 15:00:35.720063 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-2589-account-create-zksfs"] Oct 03 15:00:35 crc kubenswrapper[4774]: I1003 15:00:35.759993 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47ndl\" (UniqueName: \"kubernetes.io/projected/fcf3e07f-94a6-4786-9ef9-a7cb9f1562b0-kube-api-access-47ndl\") pod \"barbican-2589-account-create-zksfs\" (UID: \"fcf3e07f-94a6-4786-9ef9-a7cb9f1562b0\") " pod="openstack/barbican-2589-account-create-zksfs" Oct 03 15:00:35 crc kubenswrapper[4774]: I1003 15:00:35.861625 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47ndl\" (UniqueName: \"kubernetes.io/projected/fcf3e07f-94a6-4786-9ef9-a7cb9f1562b0-kube-api-access-47ndl\") pod \"barbican-2589-account-create-zksfs\" (UID: \"fcf3e07f-94a6-4786-9ef9-a7cb9f1562b0\") " 
pod="openstack/barbican-2589-account-create-zksfs" Oct 03 15:00:35 crc kubenswrapper[4774]: I1003 15:00:35.897782 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47ndl\" (UniqueName: \"kubernetes.io/projected/fcf3e07f-94a6-4786-9ef9-a7cb9f1562b0-kube-api-access-47ndl\") pod \"barbican-2589-account-create-zksfs\" (UID: \"fcf3e07f-94a6-4786-9ef9-a7cb9f1562b0\") " pod="openstack/barbican-2589-account-create-zksfs" Oct 03 15:00:35 crc kubenswrapper[4774]: I1003 15:00:35.921481 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5fbc-account-create-czgkc"] Oct 03 15:00:35 crc kubenswrapper[4774]: I1003 15:00:35.922763 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5fbc-account-create-czgkc" Oct 03 15:00:35 crc kubenswrapper[4774]: I1003 15:00:35.926989 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 03 15:00:35 crc kubenswrapper[4774]: I1003 15:00:35.927231 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5fbc-account-create-czgkc"] Oct 03 15:00:36 crc kubenswrapper[4774]: I1003 15:00:36.031421 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-2589-account-create-zksfs" Oct 03 15:00:36 crc kubenswrapper[4774]: I1003 15:00:36.064619 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg4m2\" (UniqueName: \"kubernetes.io/projected/066209ed-a6f1-4363-89d7-ad3a9b865341-kube-api-access-vg4m2\") pod \"neutron-5fbc-account-create-czgkc\" (UID: \"066209ed-a6f1-4363-89d7-ad3a9b865341\") " pod="openstack/neutron-5fbc-account-create-czgkc" Oct 03 15:00:36 crc kubenswrapper[4774]: I1003 15:00:36.166032 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg4m2\" (UniqueName: \"kubernetes.io/projected/066209ed-a6f1-4363-89d7-ad3a9b865341-kube-api-access-vg4m2\") pod \"neutron-5fbc-account-create-czgkc\" (UID: \"066209ed-a6f1-4363-89d7-ad3a9b865341\") " pod="openstack/neutron-5fbc-account-create-czgkc" Oct 03 15:00:36 crc kubenswrapper[4774]: I1003 15:00:36.182129 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg4m2\" (UniqueName: \"kubernetes.io/projected/066209ed-a6f1-4363-89d7-ad3a9b865341-kube-api-access-vg4m2\") pod \"neutron-5fbc-account-create-czgkc\" (UID: \"066209ed-a6f1-4363-89d7-ad3a9b865341\") " pod="openstack/neutron-5fbc-account-create-czgkc" Oct 03 15:00:36 crc kubenswrapper[4774]: I1003 15:00:36.253864 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5fbc-account-create-czgkc" Oct 03 15:00:36 crc kubenswrapper[4774]: I1003 15:00:36.578973 4774 generic.go:334] "Generic (PLEG): container finished" podID="f303caa2-7916-4bdc-ba01-b5c55166dc53" containerID="93db5edd6b278dc571e695d72ddf2d8458116f67d7c550c2f036184cacbf1e4d" exitCode=0 Oct 03 15:00:36 crc kubenswrapper[4774]: I1003 15:00:36.579061 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vtqvj" event={"ID":"f303caa2-7916-4bdc-ba01-b5c55166dc53","Type":"ContainerDied","Data":"93db5edd6b278dc571e695d72ddf2d8458116f67d7c550c2f036184cacbf1e4d"} Oct 03 15:00:40 crc kubenswrapper[4774]: I1003 15:00:40.739637 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8b5c85b87-8gc5c" Oct 03 15:00:40 crc kubenswrapper[4774]: I1003 15:00:40.806808 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-h54hg"] Oct 03 15:00:40 crc kubenswrapper[4774]: I1003 15:00:40.807100 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-h54hg" podUID="403b4516-d6b7-408a-9013-6789d9d99abf" containerName="dnsmasq-dns" containerID="cri-o://4c4b5a3f1c9d59a7a4e9565b18e824792f91d86a248402a2a146c1b168045fc3" gracePeriod=10 Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.248695 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c54b8bfd5-ftr47"] Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.280987 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-bc5bdf456-xt2x4"] Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.282966 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-bc5bdf456-xt2x4" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.285985 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.314526 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-bc5bdf456-xt2x4"] Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.345194 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-765f99df69-hhjpp"] Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.382381 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/871f7d16-54b6-4aa9-8e99-00a888d41f70-combined-ca-bundle\") pod \"horizon-bc5bdf456-xt2x4\" (UID: \"871f7d16-54b6-4aa9-8e99-00a888d41f70\") " pod="openstack/horizon-bc5bdf456-xt2x4" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.382465 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjcr2\" (UniqueName: \"kubernetes.io/projected/871f7d16-54b6-4aa9-8e99-00a888d41f70-kube-api-access-xjcr2\") pod \"horizon-bc5bdf456-xt2x4\" (UID: \"871f7d16-54b6-4aa9-8e99-00a888d41f70\") " pod="openstack/horizon-bc5bdf456-xt2x4" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.382512 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/871f7d16-54b6-4aa9-8e99-00a888d41f70-logs\") pod \"horizon-bc5bdf456-xt2x4\" (UID: \"871f7d16-54b6-4aa9-8e99-00a888d41f70\") " pod="openstack/horizon-bc5bdf456-xt2x4" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.382532 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/871f7d16-54b6-4aa9-8e99-00a888d41f70-config-data\") pod \"horizon-bc5bdf456-xt2x4\" (UID: \"871f7d16-54b6-4aa9-8e99-00a888d41f70\") " pod="openstack/horizon-bc5bdf456-xt2x4" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.382589 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/871f7d16-54b6-4aa9-8e99-00a888d41f70-scripts\") pod \"horizon-bc5bdf456-xt2x4\" (UID: \"871f7d16-54b6-4aa9-8e99-00a888d41f70\") " pod="openstack/horizon-bc5bdf456-xt2x4" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.382605 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/871f7d16-54b6-4aa9-8e99-00a888d41f70-horizon-tls-certs\") pod \"horizon-bc5bdf456-xt2x4\" (UID: \"871f7d16-54b6-4aa9-8e99-00a888d41f70\") " pod="openstack/horizon-bc5bdf456-xt2x4" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.382623 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/871f7d16-54b6-4aa9-8e99-00a888d41f70-horizon-secret-key\") pod \"horizon-bc5bdf456-xt2x4\" (UID: \"871f7d16-54b6-4aa9-8e99-00a888d41f70\") " pod="openstack/horizon-bc5bdf456-xt2x4" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.387360 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-f865bb968-k9r7v"] Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.389238 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-f865bb968-k9r7v" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.408119 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f865bb968-k9r7v"] Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.484506 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/871f7d16-54b6-4aa9-8e99-00a888d41f70-scripts\") pod \"horizon-bc5bdf456-xt2x4\" (UID: \"871f7d16-54b6-4aa9-8e99-00a888d41f70\") " pod="openstack/horizon-bc5bdf456-xt2x4" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.484562 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/871f7d16-54b6-4aa9-8e99-00a888d41f70-horizon-tls-certs\") pod \"horizon-bc5bdf456-xt2x4\" (UID: \"871f7d16-54b6-4aa9-8e99-00a888d41f70\") " pod="openstack/horizon-bc5bdf456-xt2x4" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.484590 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/871f7d16-54b6-4aa9-8e99-00a888d41f70-horizon-secret-key\") pod \"horizon-bc5bdf456-xt2x4\" (UID: \"871f7d16-54b6-4aa9-8e99-00a888d41f70\") " pod="openstack/horizon-bc5bdf456-xt2x4" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.484628 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szrxn\" (UniqueName: \"kubernetes.io/projected/c0b4826d-75e2-4023-8d53-3ddd0da5bc2e-kube-api-access-szrxn\") pod \"horizon-f865bb968-k9r7v\" (UID: \"c0b4826d-75e2-4023-8d53-3ddd0da5bc2e\") " pod="openstack/horizon-f865bb968-k9r7v" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.484688 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/871f7d16-54b6-4aa9-8e99-00a888d41f70-combined-ca-bundle\") pod \"horizon-bc5bdf456-xt2x4\" (UID: \"871f7d16-54b6-4aa9-8e99-00a888d41f70\") " pod="openstack/horizon-bc5bdf456-xt2x4" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.484738 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0b4826d-75e2-4023-8d53-3ddd0da5bc2e-logs\") pod \"horizon-f865bb968-k9r7v\" (UID: \"c0b4826d-75e2-4023-8d53-3ddd0da5bc2e\") " pod="openstack/horizon-f865bb968-k9r7v" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.484769 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0b4826d-75e2-4023-8d53-3ddd0da5bc2e-combined-ca-bundle\") pod \"horizon-f865bb968-k9r7v\" (UID: \"c0b4826d-75e2-4023-8d53-3ddd0da5bc2e\") " pod="openstack/horizon-f865bb968-k9r7v" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.484798 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjcr2\" (UniqueName: \"kubernetes.io/projected/871f7d16-54b6-4aa9-8e99-00a888d41f70-kube-api-access-xjcr2\") pod \"horizon-bc5bdf456-xt2x4\" (UID: \"871f7d16-54b6-4aa9-8e99-00a888d41f70\") " pod="openstack/horizon-bc5bdf456-xt2x4" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.484839 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0b4826d-75e2-4023-8d53-3ddd0da5bc2e-horizon-tls-certs\") pod \"horizon-f865bb968-k9r7v\" (UID: \"c0b4826d-75e2-4023-8d53-3ddd0da5bc2e\") " pod="openstack/horizon-f865bb968-k9r7v" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.484866 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/871f7d16-54b6-4aa9-8e99-00a888d41f70-logs\") pod \"horizon-bc5bdf456-xt2x4\" (UID: \"871f7d16-54b6-4aa9-8e99-00a888d41f70\") " pod="openstack/horizon-bc5bdf456-xt2x4" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.484891 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/871f7d16-54b6-4aa9-8e99-00a888d41f70-config-data\") pod \"horizon-bc5bdf456-xt2x4\" (UID: \"871f7d16-54b6-4aa9-8e99-00a888d41f70\") " pod="openstack/horizon-bc5bdf456-xt2x4" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.484918 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c0b4826d-75e2-4023-8d53-3ddd0da5bc2e-horizon-secret-key\") pod \"horizon-f865bb968-k9r7v\" (UID: \"c0b4826d-75e2-4023-8d53-3ddd0da5bc2e\") " pod="openstack/horizon-f865bb968-k9r7v" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.484947 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0b4826d-75e2-4023-8d53-3ddd0da5bc2e-config-data\") pod \"horizon-f865bb968-k9r7v\" (UID: \"c0b4826d-75e2-4023-8d53-3ddd0da5bc2e\") " pod="openstack/horizon-f865bb968-k9r7v" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.484989 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0b4826d-75e2-4023-8d53-3ddd0da5bc2e-scripts\") pod \"horizon-f865bb968-k9r7v\" (UID: \"c0b4826d-75e2-4023-8d53-3ddd0da5bc2e\") " pod="openstack/horizon-f865bb968-k9r7v" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.485225 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/871f7d16-54b6-4aa9-8e99-00a888d41f70-scripts\") pod 
\"horizon-bc5bdf456-xt2x4\" (UID: \"871f7d16-54b6-4aa9-8e99-00a888d41f70\") " pod="openstack/horizon-bc5bdf456-xt2x4" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.486137 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/871f7d16-54b6-4aa9-8e99-00a888d41f70-config-data\") pod \"horizon-bc5bdf456-xt2x4\" (UID: \"871f7d16-54b6-4aa9-8e99-00a888d41f70\") " pod="openstack/horizon-bc5bdf456-xt2x4" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.486392 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/871f7d16-54b6-4aa9-8e99-00a888d41f70-logs\") pod \"horizon-bc5bdf456-xt2x4\" (UID: \"871f7d16-54b6-4aa9-8e99-00a888d41f70\") " pod="openstack/horizon-bc5bdf456-xt2x4" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.492224 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/871f7d16-54b6-4aa9-8e99-00a888d41f70-horizon-tls-certs\") pod \"horizon-bc5bdf456-xt2x4\" (UID: \"871f7d16-54b6-4aa9-8e99-00a888d41f70\") " pod="openstack/horizon-bc5bdf456-xt2x4" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.496804 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/871f7d16-54b6-4aa9-8e99-00a888d41f70-horizon-secret-key\") pod \"horizon-bc5bdf456-xt2x4\" (UID: \"871f7d16-54b6-4aa9-8e99-00a888d41f70\") " pod="openstack/horizon-bc5bdf456-xt2x4" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.505246 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/871f7d16-54b6-4aa9-8e99-00a888d41f70-combined-ca-bundle\") pod \"horizon-bc5bdf456-xt2x4\" (UID: \"871f7d16-54b6-4aa9-8e99-00a888d41f70\") " pod="openstack/horizon-bc5bdf456-xt2x4" Oct 03 15:00:41 crc 
kubenswrapper[4774]: I1003 15:00:41.506727 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjcr2\" (UniqueName: \"kubernetes.io/projected/871f7d16-54b6-4aa9-8e99-00a888d41f70-kube-api-access-xjcr2\") pod \"horizon-bc5bdf456-xt2x4\" (UID: \"871f7d16-54b6-4aa9-8e99-00a888d41f70\") " pod="openstack/horizon-bc5bdf456-xt2x4" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.586875 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c0b4826d-75e2-4023-8d53-3ddd0da5bc2e-horizon-secret-key\") pod \"horizon-f865bb968-k9r7v\" (UID: \"c0b4826d-75e2-4023-8d53-3ddd0da5bc2e\") " pod="openstack/horizon-f865bb968-k9r7v" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.586920 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0b4826d-75e2-4023-8d53-3ddd0da5bc2e-config-data\") pod \"horizon-f865bb968-k9r7v\" (UID: \"c0b4826d-75e2-4023-8d53-3ddd0da5bc2e\") " pod="openstack/horizon-f865bb968-k9r7v" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.586954 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0b4826d-75e2-4023-8d53-3ddd0da5bc2e-scripts\") pod \"horizon-f865bb968-k9r7v\" (UID: \"c0b4826d-75e2-4023-8d53-3ddd0da5bc2e\") " pod="openstack/horizon-f865bb968-k9r7v" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.587001 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szrxn\" (UniqueName: \"kubernetes.io/projected/c0b4826d-75e2-4023-8d53-3ddd0da5bc2e-kube-api-access-szrxn\") pod \"horizon-f865bb968-k9r7v\" (UID: \"c0b4826d-75e2-4023-8d53-3ddd0da5bc2e\") " pod="openstack/horizon-f865bb968-k9r7v" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.587057 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0b4826d-75e2-4023-8d53-3ddd0da5bc2e-logs\") pod \"horizon-f865bb968-k9r7v\" (UID: \"c0b4826d-75e2-4023-8d53-3ddd0da5bc2e\") " pod="openstack/horizon-f865bb968-k9r7v" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.587081 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0b4826d-75e2-4023-8d53-3ddd0da5bc2e-combined-ca-bundle\") pod \"horizon-f865bb968-k9r7v\" (UID: \"c0b4826d-75e2-4023-8d53-3ddd0da5bc2e\") " pod="openstack/horizon-f865bb968-k9r7v" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.587111 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0b4826d-75e2-4023-8d53-3ddd0da5bc2e-horizon-tls-certs\") pod \"horizon-f865bb968-k9r7v\" (UID: \"c0b4826d-75e2-4023-8d53-3ddd0da5bc2e\") " pod="openstack/horizon-f865bb968-k9r7v" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.587740 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0b4826d-75e2-4023-8d53-3ddd0da5bc2e-logs\") pod \"horizon-f865bb968-k9r7v\" (UID: \"c0b4826d-75e2-4023-8d53-3ddd0da5bc2e\") " pod="openstack/horizon-f865bb968-k9r7v" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.588000 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0b4826d-75e2-4023-8d53-3ddd0da5bc2e-scripts\") pod \"horizon-f865bb968-k9r7v\" (UID: \"c0b4826d-75e2-4023-8d53-3ddd0da5bc2e\") " pod="openstack/horizon-f865bb968-k9r7v" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.589158 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0b4826d-75e2-4023-8d53-3ddd0da5bc2e-config-data\") pod 
\"horizon-f865bb968-k9r7v\" (UID: \"c0b4826d-75e2-4023-8d53-3ddd0da5bc2e\") " pod="openstack/horizon-f865bb968-k9r7v" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.590540 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0b4826d-75e2-4023-8d53-3ddd0da5bc2e-horizon-tls-certs\") pod \"horizon-f865bb968-k9r7v\" (UID: \"c0b4826d-75e2-4023-8d53-3ddd0da5bc2e\") " pod="openstack/horizon-f865bb968-k9r7v" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.591364 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0b4826d-75e2-4023-8d53-3ddd0da5bc2e-combined-ca-bundle\") pod \"horizon-f865bb968-k9r7v\" (UID: \"c0b4826d-75e2-4023-8d53-3ddd0da5bc2e\") " pod="openstack/horizon-f865bb968-k9r7v" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.593826 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c0b4826d-75e2-4023-8d53-3ddd0da5bc2e-horizon-secret-key\") pod \"horizon-f865bb968-k9r7v\" (UID: \"c0b4826d-75e2-4023-8d53-3ddd0da5bc2e\") " pod="openstack/horizon-f865bb968-k9r7v" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.607510 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szrxn\" (UniqueName: \"kubernetes.io/projected/c0b4826d-75e2-4023-8d53-3ddd0da5bc2e-kube-api-access-szrxn\") pod \"horizon-f865bb968-k9r7v\" (UID: \"c0b4826d-75e2-4023-8d53-3ddd0da5bc2e\") " pod="openstack/horizon-f865bb968-k9r7v" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.611452 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-bc5bdf456-xt2x4" Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.636420 4774 generic.go:334] "Generic (PLEG): container finished" podID="403b4516-d6b7-408a-9013-6789d9d99abf" containerID="4c4b5a3f1c9d59a7a4e9565b18e824792f91d86a248402a2a146c1b168045fc3" exitCode=0 Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.636471 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-h54hg" event={"ID":"403b4516-d6b7-408a-9013-6789d9d99abf","Type":"ContainerDied","Data":"4c4b5a3f1c9d59a7a4e9565b18e824792f91d86a248402a2a146c1b168045fc3"} Oct 03 15:00:41 crc kubenswrapper[4774]: I1003 15:00:41.704960 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f865bb968-k9r7v" Oct 03 15:00:42 crc kubenswrapper[4774]: I1003 15:00:42.539885 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-h54hg" podUID="403b4516-d6b7-408a-9013-6789d9d99abf" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused" Oct 03 15:00:42 crc kubenswrapper[4774]: I1003 15:00:42.790200 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 15:00:42 crc kubenswrapper[4774]: I1003 15:00:42.909620 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a3cfed0-4f99-417c-830b-54217f4bad49-httpd-run\") pod \"8a3cfed0-4f99-417c-830b-54217f4bad49\" (UID: \"8a3cfed0-4f99-417c-830b-54217f4bad49\") " Oct 03 15:00:42 crc kubenswrapper[4774]: I1003 15:00:42.909705 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a3cfed0-4f99-417c-830b-54217f4bad49-config-data\") pod \"8a3cfed0-4f99-417c-830b-54217f4bad49\" (UID: \"8a3cfed0-4f99-417c-830b-54217f4bad49\") " Oct 03 15:00:42 crc kubenswrapper[4774]: I1003 15:00:42.909901 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"8a3cfed0-4f99-417c-830b-54217f4bad49\" (UID: \"8a3cfed0-4f99-417c-830b-54217f4bad49\") " Oct 03 15:00:42 crc kubenswrapper[4774]: I1003 15:00:42.909984 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a3cfed0-4f99-417c-830b-54217f4bad49-scripts\") pod \"8a3cfed0-4f99-417c-830b-54217f4bad49\" (UID: \"8a3cfed0-4f99-417c-830b-54217f4bad49\") " Oct 03 15:00:42 crc kubenswrapper[4774]: I1003 15:00:42.910019 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a3cfed0-4f99-417c-830b-54217f4bad49-combined-ca-bundle\") pod \"8a3cfed0-4f99-417c-830b-54217f4bad49\" (UID: \"8a3cfed0-4f99-417c-830b-54217f4bad49\") " Oct 03 15:00:42 crc kubenswrapper[4774]: I1003 15:00:42.910068 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8a3cfed0-4f99-417c-830b-54217f4bad49-logs\") pod \"8a3cfed0-4f99-417c-830b-54217f4bad49\" (UID: \"8a3cfed0-4f99-417c-830b-54217f4bad49\") " Oct 03 15:00:42 crc kubenswrapper[4774]: I1003 15:00:42.910101 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpcdb\" (UniqueName: \"kubernetes.io/projected/8a3cfed0-4f99-417c-830b-54217f4bad49-kube-api-access-xpcdb\") pod \"8a3cfed0-4f99-417c-830b-54217f4bad49\" (UID: \"8a3cfed0-4f99-417c-830b-54217f4bad49\") " Oct 03 15:00:42 crc kubenswrapper[4774]: I1003 15:00:42.910226 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a3cfed0-4f99-417c-830b-54217f4bad49-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8a3cfed0-4f99-417c-830b-54217f4bad49" (UID: "8a3cfed0-4f99-417c-830b-54217f4bad49"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:00:42 crc kubenswrapper[4774]: I1003 15:00:42.910403 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a3cfed0-4f99-417c-830b-54217f4bad49-logs" (OuterVolumeSpecName: "logs") pod "8a3cfed0-4f99-417c-830b-54217f4bad49" (UID: "8a3cfed0-4f99-417c-830b-54217f4bad49"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:00:42 crc kubenswrapper[4774]: I1003 15:00:42.910877 4774 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a3cfed0-4f99-417c-830b-54217f4bad49-logs\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:42 crc kubenswrapper[4774]: I1003 15:00:42.910902 4774 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a3cfed0-4f99-417c-830b-54217f4bad49-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:42 crc kubenswrapper[4774]: I1003 15:00:42.914790 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a3cfed0-4f99-417c-830b-54217f4bad49-kube-api-access-xpcdb" (OuterVolumeSpecName: "kube-api-access-xpcdb") pod "8a3cfed0-4f99-417c-830b-54217f4bad49" (UID: "8a3cfed0-4f99-417c-830b-54217f4bad49"). InnerVolumeSpecName "kube-api-access-xpcdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:00:42 crc kubenswrapper[4774]: I1003 15:00:42.914988 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "8a3cfed0-4f99-417c-830b-54217f4bad49" (UID: "8a3cfed0-4f99-417c-830b-54217f4bad49"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 15:00:42 crc kubenswrapper[4774]: I1003 15:00:42.916567 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a3cfed0-4f99-417c-830b-54217f4bad49-scripts" (OuterVolumeSpecName: "scripts") pod "8a3cfed0-4f99-417c-830b-54217f4bad49" (UID: "8a3cfed0-4f99-417c-830b-54217f4bad49"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:00:42 crc kubenswrapper[4774]: I1003 15:00:42.938061 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a3cfed0-4f99-417c-830b-54217f4bad49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a3cfed0-4f99-417c-830b-54217f4bad49" (UID: "8a3cfed0-4f99-417c-830b-54217f4bad49"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:00:42 crc kubenswrapper[4774]: I1003 15:00:42.951666 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a3cfed0-4f99-417c-830b-54217f4bad49-config-data" (OuterVolumeSpecName: "config-data") pod "8a3cfed0-4f99-417c-830b-54217f4bad49" (UID: "8a3cfed0-4f99-417c-830b-54217f4bad49"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:00:42 crc kubenswrapper[4774]: I1003 15:00:42.974416 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.012036 4774 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.012283 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a3cfed0-4f99-417c-830b-54217f4bad49-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.012349 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a3cfed0-4f99-417c-830b-54217f4bad49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.012426 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpcdb\" (UniqueName: \"kubernetes.io/projected/8a3cfed0-4f99-417c-830b-54217f4bad49-kube-api-access-xpcdb\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.012484 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a3cfed0-4f99-417c-830b-54217f4bad49-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.044931 4774 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.113642 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"86fdf9e4-cf58-46e2-b541-2c03fda113c5\" (UID: \"86fdf9e4-cf58-46e2-b541-2c03fda113c5\") " Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.113810 
4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86fdf9e4-cf58-46e2-b541-2c03fda113c5-combined-ca-bundle\") pod \"86fdf9e4-cf58-46e2-b541-2c03fda113c5\" (UID: \"86fdf9e4-cf58-46e2-b541-2c03fda113c5\") " Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.113924 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zr278\" (UniqueName: \"kubernetes.io/projected/86fdf9e4-cf58-46e2-b541-2c03fda113c5-kube-api-access-zr278\") pod \"86fdf9e4-cf58-46e2-b541-2c03fda113c5\" (UID: \"86fdf9e4-cf58-46e2-b541-2c03fda113c5\") " Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.113972 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86fdf9e4-cf58-46e2-b541-2c03fda113c5-logs\") pod \"86fdf9e4-cf58-46e2-b541-2c03fda113c5\" (UID: \"86fdf9e4-cf58-46e2-b541-2c03fda113c5\") " Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.114005 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/86fdf9e4-cf58-46e2-b541-2c03fda113c5-httpd-run\") pod \"86fdf9e4-cf58-46e2-b541-2c03fda113c5\" (UID: \"86fdf9e4-cf58-46e2-b541-2c03fda113c5\") " Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.114033 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86fdf9e4-cf58-46e2-b541-2c03fda113c5-scripts\") pod \"86fdf9e4-cf58-46e2-b541-2c03fda113c5\" (UID: \"86fdf9e4-cf58-46e2-b541-2c03fda113c5\") " Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.114074 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86fdf9e4-cf58-46e2-b541-2c03fda113c5-config-data\") pod \"86fdf9e4-cf58-46e2-b541-2c03fda113c5\" (UID: 
\"86fdf9e4-cf58-46e2-b541-2c03fda113c5\") " Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.114405 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86fdf9e4-cf58-46e2-b541-2c03fda113c5-logs" (OuterVolumeSpecName: "logs") pod "86fdf9e4-cf58-46e2-b541-2c03fda113c5" (UID: "86fdf9e4-cf58-46e2-b541-2c03fda113c5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.114704 4774 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.114730 4774 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86fdf9e4-cf58-46e2-b541-2c03fda113c5-logs\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.114727 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86fdf9e4-cf58-46e2-b541-2c03fda113c5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "86fdf9e4-cf58-46e2-b541-2c03fda113c5" (UID: "86fdf9e4-cf58-46e2-b541-2c03fda113c5"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.117751 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "86fdf9e4-cf58-46e2-b541-2c03fda113c5" (UID: "86fdf9e4-cf58-46e2-b541-2c03fda113c5"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.118058 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86fdf9e4-cf58-46e2-b541-2c03fda113c5-scripts" (OuterVolumeSpecName: "scripts") pod "86fdf9e4-cf58-46e2-b541-2c03fda113c5" (UID: "86fdf9e4-cf58-46e2-b541-2c03fda113c5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.118182 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86fdf9e4-cf58-46e2-b541-2c03fda113c5-kube-api-access-zr278" (OuterVolumeSpecName: "kube-api-access-zr278") pod "86fdf9e4-cf58-46e2-b541-2c03fda113c5" (UID: "86fdf9e4-cf58-46e2-b541-2c03fda113c5"). InnerVolumeSpecName "kube-api-access-zr278". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.146749 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86fdf9e4-cf58-46e2-b541-2c03fda113c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86fdf9e4-cf58-46e2-b541-2c03fda113c5" (UID: "86fdf9e4-cf58-46e2-b541-2c03fda113c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.160604 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86fdf9e4-cf58-46e2-b541-2c03fda113c5-config-data" (OuterVolumeSpecName: "config-data") pod "86fdf9e4-cf58-46e2-b541-2c03fda113c5" (UID: "86fdf9e4-cf58-46e2-b541-2c03fda113c5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.218288 4774 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.218592 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86fdf9e4-cf58-46e2-b541-2c03fda113c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.218606 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zr278\" (UniqueName: \"kubernetes.io/projected/86fdf9e4-cf58-46e2-b541-2c03fda113c5-kube-api-access-zr278\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.218616 4774 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/86fdf9e4-cf58-46e2-b541-2c03fda113c5-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.218624 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86fdf9e4-cf58-46e2-b541-2c03fda113c5-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.218654 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86fdf9e4-cf58-46e2-b541-2c03fda113c5-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.235357 4774 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.322462 4774 reconciler_common.go:293] "Volume detached for volume 
\"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.653399 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8a3cfed0-4f99-417c-830b-54217f4bad49","Type":"ContainerDied","Data":"9d30bb8a03e141a900f3a5c07ea12f2023a9968f161d75aff7f71fcf56c0e623"} Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.653453 4774 scope.go:117] "RemoveContainer" containerID="ec1218a393f5b0da2477da87608817b8eda9ddda25d8f55b89f13b0c495136c9" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.653455 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.655808 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"86fdf9e4-cf58-46e2-b541-2c03fda113c5","Type":"ContainerDied","Data":"5fb29dc367a9e6c1b221a426502d3dfe1b9af3d71a3926115bf8b935443fcb35"} Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.655910 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.696396 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.716029 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.744638 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.753561 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.766806 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 15:00:43 crc kubenswrapper[4774]: E1003 15:00:43.767351 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86fdf9e4-cf58-46e2-b541-2c03fda113c5" containerName="glance-log" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.767464 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="86fdf9e4-cf58-46e2-b541-2c03fda113c5" containerName="glance-log" Oct 03 15:00:43 crc kubenswrapper[4774]: E1003 15:00:43.767486 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86fdf9e4-cf58-46e2-b541-2c03fda113c5" containerName="glance-httpd" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.767495 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="86fdf9e4-cf58-46e2-b541-2c03fda113c5" containerName="glance-httpd" Oct 03 15:00:43 crc kubenswrapper[4774]: E1003 15:00:43.767552 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a3cfed0-4f99-417c-830b-54217f4bad49" containerName="glance-httpd" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.767567 4774 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8a3cfed0-4f99-417c-830b-54217f4bad49" containerName="glance-httpd" Oct 03 15:00:43 crc kubenswrapper[4774]: E1003 15:00:43.767614 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a3cfed0-4f99-417c-830b-54217f4bad49" containerName="glance-log" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.767625 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a3cfed0-4f99-417c-830b-54217f4bad49" containerName="glance-log" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.768826 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a3cfed0-4f99-417c-830b-54217f4bad49" containerName="glance-httpd" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.768846 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="86fdf9e4-cf58-46e2-b541-2c03fda113c5" containerName="glance-log" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.768856 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="86fdf9e4-cf58-46e2-b541-2c03fda113c5" containerName="glance-httpd" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.768866 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a3cfed0-4f99-417c-830b-54217f4bad49" containerName="glance-log" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.770510 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.772591 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.772810 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.773013 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-q7vtm" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.773167 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.779086 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.789368 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.791195 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.793995 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.794310 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.809202 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.931988 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d22e7a67-ce5f-4276-8c81-4ff98ad47524-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.932033 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22e7a67-ce5f-4276-8c81-4ff98ad47524-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.932072 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d22e7a67-ce5f-4276-8c81-4ff98ad47524-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.932088 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.932167 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d22e7a67-ce5f-4276-8c81-4ff98ad47524-logs\") pod \"glance-default-internal-api-0\" (UID: \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.932191 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74fr5\" (UniqueName: \"kubernetes.io/projected/d22e7a67-ce5f-4276-8c81-4ff98ad47524-kube-api-access-74fr5\") pod \"glance-default-internal-api-0\" (UID: \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.932387 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.932481 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-logs\") pod \"glance-default-external-api-0\" (UID: \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.932641 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.932676 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.932695 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-config-data\") pod \"glance-default-external-api-0\" (UID: \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.932754 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwnl4\" (UniqueName: \"kubernetes.io/projected/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-kube-api-access-dwnl4\") pod \"glance-default-external-api-0\" (UID: \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.932815 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-scripts\") pod \"glance-default-external-api-0\" (UID: \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.932874 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.932954 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d22e7a67-ce5f-4276-8c81-4ff98ad47524-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:43 crc kubenswrapper[4774]: I1003 15:00:43.932992 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d22e7a67-ce5f-4276-8c81-4ff98ad47524-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: I1003 15:00:44.034336 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: I1003 15:00:44.034412 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: I1003 15:00:44.034437 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-config-data\") pod \"glance-default-external-api-0\" (UID: \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: I1003 15:00:44.034480 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwnl4\" (UniqueName: \"kubernetes.io/projected/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-kube-api-access-dwnl4\") pod \"glance-default-external-api-0\" (UID: \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: I1003 15:00:44.034507 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-scripts\") pod \"glance-default-external-api-0\" (UID: \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: I1003 15:00:44.034527 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: I1003 15:00:44.034556 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d22e7a67-ce5f-4276-8c81-4ff98ad47524-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: I1003 15:00:44.034588 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d22e7a67-ce5f-4276-8c81-4ff98ad47524-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: I1003 15:00:44.034626 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d22e7a67-ce5f-4276-8c81-4ff98ad47524-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: I1003 15:00:44.034656 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22e7a67-ce5f-4276-8c81-4ff98ad47524-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: I1003 15:00:44.034683 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d22e7a67-ce5f-4276-8c81-4ff98ad47524-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: I1003 15:00:44.034704 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: I1003 15:00:44.034772 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d22e7a67-ce5f-4276-8c81-4ff98ad47524-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: I1003 15:00:44.034799 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74fr5\" (UniqueName: \"kubernetes.io/projected/d22e7a67-ce5f-4276-8c81-4ff98ad47524-kube-api-access-74fr5\") pod \"glance-default-internal-api-0\" (UID: \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: I1003 15:00:44.034806 4774 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: I1003 15:00:44.035437 4774 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: I1003 15:00:44.035503 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d22e7a67-ce5f-4276-8c81-4ff98ad47524-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: I1003 15:00:44.043858 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d22e7a67-ce5f-4276-8c81-4ff98ad47524-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"d22e7a67-ce5f-4276-8c81-4ff98ad47524\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: I1003 15:00:44.044201 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: I1003 15:00:44.044618 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-config-data\") pod \"glance-default-external-api-0\" (UID: \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: I1003 15:00:44.050596 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d22e7a67-ce5f-4276-8c81-4ff98ad47524-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: I1003 15:00:44.052599 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-scripts\") pod \"glance-default-external-api-0\" (UID: \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: I1003 15:00:44.052854 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d22e7a67-ce5f-4276-8c81-4ff98ad47524-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: 
I1003 15:00:44.053801 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22e7a67-ce5f-4276-8c81-4ff98ad47524-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: I1003 15:00:44.053808 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: I1003 15:00:44.053940 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-logs\") pod \"glance-default-external-api-0\" (UID: \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: I1003 15:00:44.054260 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-logs\") pod \"glance-default-external-api-0\" (UID: \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: I1003 15:00:44.054535 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: I1003 15:00:44.056608 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-74fr5\" (UniqueName: \"kubernetes.io/projected/d22e7a67-ce5f-4276-8c81-4ff98ad47524-kube-api-access-74fr5\") pod \"glance-default-internal-api-0\" (UID: \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: I1003 15:00:44.058458 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: I1003 15:00:44.059195 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d22e7a67-ce5f-4276-8c81-4ff98ad47524-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: I1003 15:00:44.074731 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwnl4\" (UniqueName: \"kubernetes.io/projected/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-kube-api-access-dwnl4\") pod \"glance-default-external-api-0\" (UID: \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: I1003 15:00:44.077111 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: I1003 15:00:44.091276 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\") " pod="openstack/glance-default-external-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: I1003 15:00:44.097102 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 15:00:44 crc kubenswrapper[4774]: I1003 15:00:44.110588 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 15:00:45 crc kubenswrapper[4774]: I1003 15:00:45.310027 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86fdf9e4-cf58-46e2-b541-2c03fda113c5" path="/var/lib/kubelet/pods/86fdf9e4-cf58-46e2-b541-2c03fda113c5/volumes" Oct 03 15:00:45 crc kubenswrapper[4774]: I1003 15:00:45.311127 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a3cfed0-4f99-417c-830b-54217f4bad49" path="/var/lib/kubelet/pods/8a3cfed0-4f99-417c-830b-54217f4bad49/volumes" Oct 03 15:00:47 crc kubenswrapper[4774]: I1003 15:00:47.539839 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-h54hg" podUID="403b4516-d6b7-408a-9013-6789d9d99abf" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused" Oct 03 15:00:52 crc kubenswrapper[4774]: I1003 15:00:52.540203 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-h54hg" podUID="403b4516-d6b7-408a-9013-6789d9d99abf" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused" Oct 03 15:00:52 crc kubenswrapper[4774]: I1003 15:00:52.541186 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-h54hg" Oct 03 15:00:55 crc kubenswrapper[4774]: I1003 15:00:55.567447 4774 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/keystone-bootstrap-vtqvj" Oct 03 15:00:55 crc kubenswrapper[4774]: I1003 15:00:55.669254 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f303caa2-7916-4bdc-ba01-b5c55166dc53-credential-keys\") pod \"f303caa2-7916-4bdc-ba01-b5c55166dc53\" (UID: \"f303caa2-7916-4bdc-ba01-b5c55166dc53\") " Oct 03 15:00:55 crc kubenswrapper[4774]: I1003 15:00:55.669751 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f303caa2-7916-4bdc-ba01-b5c55166dc53-combined-ca-bundle\") pod \"f303caa2-7916-4bdc-ba01-b5c55166dc53\" (UID: \"f303caa2-7916-4bdc-ba01-b5c55166dc53\") " Oct 03 15:00:55 crc kubenswrapper[4774]: I1003 15:00:55.669816 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f303caa2-7916-4bdc-ba01-b5c55166dc53-scripts\") pod \"f303caa2-7916-4bdc-ba01-b5c55166dc53\" (UID: \"f303caa2-7916-4bdc-ba01-b5c55166dc53\") " Oct 03 15:00:55 crc kubenswrapper[4774]: I1003 15:00:55.669851 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f303caa2-7916-4bdc-ba01-b5c55166dc53-fernet-keys\") pod \"f303caa2-7916-4bdc-ba01-b5c55166dc53\" (UID: \"f303caa2-7916-4bdc-ba01-b5c55166dc53\") " Oct 03 15:00:55 crc kubenswrapper[4774]: I1003 15:00:55.669943 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f303caa2-7916-4bdc-ba01-b5c55166dc53-config-data\") pod \"f303caa2-7916-4bdc-ba01-b5c55166dc53\" (UID: \"f303caa2-7916-4bdc-ba01-b5c55166dc53\") " Oct 03 15:00:55 crc kubenswrapper[4774]: I1003 15:00:55.670006 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glr9f\" (UniqueName: 
\"kubernetes.io/projected/f303caa2-7916-4bdc-ba01-b5c55166dc53-kube-api-access-glr9f\") pod \"f303caa2-7916-4bdc-ba01-b5c55166dc53\" (UID: \"f303caa2-7916-4bdc-ba01-b5c55166dc53\") " Oct 03 15:00:55 crc kubenswrapper[4774]: I1003 15:00:55.675149 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f303caa2-7916-4bdc-ba01-b5c55166dc53-kube-api-access-glr9f" (OuterVolumeSpecName: "kube-api-access-glr9f") pod "f303caa2-7916-4bdc-ba01-b5c55166dc53" (UID: "f303caa2-7916-4bdc-ba01-b5c55166dc53"). InnerVolumeSpecName "kube-api-access-glr9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:00:55 crc kubenswrapper[4774]: I1003 15:00:55.675713 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f303caa2-7916-4bdc-ba01-b5c55166dc53-scripts" (OuterVolumeSpecName: "scripts") pod "f303caa2-7916-4bdc-ba01-b5c55166dc53" (UID: "f303caa2-7916-4bdc-ba01-b5c55166dc53"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:00:55 crc kubenswrapper[4774]: I1003 15:00:55.675858 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f303caa2-7916-4bdc-ba01-b5c55166dc53-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f303caa2-7916-4bdc-ba01-b5c55166dc53" (UID: "f303caa2-7916-4bdc-ba01-b5c55166dc53"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:00:55 crc kubenswrapper[4774]: I1003 15:00:55.698335 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f303caa2-7916-4bdc-ba01-b5c55166dc53-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f303caa2-7916-4bdc-ba01-b5c55166dc53" (UID: "f303caa2-7916-4bdc-ba01-b5c55166dc53"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:00:55 crc kubenswrapper[4774]: I1003 15:00:55.698846 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f303caa2-7916-4bdc-ba01-b5c55166dc53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f303caa2-7916-4bdc-ba01-b5c55166dc53" (UID: "f303caa2-7916-4bdc-ba01-b5c55166dc53"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:00:55 crc kubenswrapper[4774]: I1003 15:00:55.710718 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f303caa2-7916-4bdc-ba01-b5c55166dc53-config-data" (OuterVolumeSpecName: "config-data") pod "f303caa2-7916-4bdc-ba01-b5c55166dc53" (UID: "f303caa2-7916-4bdc-ba01-b5c55166dc53"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:00:55 crc kubenswrapper[4774]: I1003 15:00:55.763870 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vtqvj" event={"ID":"f303caa2-7916-4bdc-ba01-b5c55166dc53","Type":"ContainerDied","Data":"9b0924bb85f439a17915bd2ac19c49c3804b210b914bc4bca6d54da4b65c8934"} Oct 03 15:00:55 crc kubenswrapper[4774]: I1003 15:00:55.763914 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b0924bb85f439a17915bd2ac19c49c3804b210b914bc4bca6d54da4b65c8934" Oct 03 15:00:55 crc kubenswrapper[4774]: I1003 15:00:55.763975 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-vtqvj" Oct 03 15:00:55 crc kubenswrapper[4774]: I1003 15:00:55.772259 4774 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f303caa2-7916-4bdc-ba01-b5c55166dc53-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:55 crc kubenswrapper[4774]: I1003 15:00:55.772283 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f303caa2-7916-4bdc-ba01-b5c55166dc53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:55 crc kubenswrapper[4774]: I1003 15:00:55.772293 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f303caa2-7916-4bdc-ba01-b5c55166dc53-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:55 crc kubenswrapper[4774]: I1003 15:00:55.772301 4774 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f303caa2-7916-4bdc-ba01-b5c55166dc53-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:55 crc kubenswrapper[4774]: I1003 15:00:55.772309 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f303caa2-7916-4bdc-ba01-b5c55166dc53-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:55 crc kubenswrapper[4774]: I1003 15:00:55.772318 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glr9f\" (UniqueName: \"kubernetes.io/projected/f303caa2-7916-4bdc-ba01-b5c55166dc53-kube-api-access-glr9f\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:55 crc kubenswrapper[4774]: E1003 15:00:55.899506 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Oct 03 15:00:55 crc kubenswrapper[4774]: E1003 15:00:55.899681 
4774 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n644h674h665h5cch5bdh548h6dh5d5h7bh645hdbh697h66bh76h586h5fch585h579hd7h9bh85h576h5d8h5f9h77hfbh57hf4h54fh96hb4hb5q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fmndq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(d9e41eee-4655-4cc2-b01d-37d1f947011b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 15:00:56 crc kubenswrapper[4774]: I1003 15:00:56.745249 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-vtqvj"] Oct 03 15:00:56 crc kubenswrapper[4774]: I1003 15:00:56.754423 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-vtqvj"] Oct 03 15:00:56 crc kubenswrapper[4774]: I1003 15:00:56.849404 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-w7hcx"] Oct 03 15:00:56 crc kubenswrapper[4774]: E1003 15:00:56.849859 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f303caa2-7916-4bdc-ba01-b5c55166dc53" containerName="keystone-bootstrap" Oct 03 15:00:56 crc kubenswrapper[4774]: I1003 15:00:56.849877 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f303caa2-7916-4bdc-ba01-b5c55166dc53" containerName="keystone-bootstrap" Oct 03 15:00:56 crc kubenswrapper[4774]: I1003 15:00:56.850104 4774 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f303caa2-7916-4bdc-ba01-b5c55166dc53" containerName="keystone-bootstrap" Oct 03 15:00:56 crc kubenswrapper[4774]: I1003 15:00:56.850869 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-w7hcx" Oct 03 15:00:56 crc kubenswrapper[4774]: I1003 15:00:56.852544 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-5rssc" Oct 03 15:00:56 crc kubenswrapper[4774]: I1003 15:00:56.854598 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 15:00:56 crc kubenswrapper[4774]: I1003 15:00:56.854599 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 15:00:56 crc kubenswrapper[4774]: I1003 15:00:56.854852 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 15:00:56 crc kubenswrapper[4774]: I1003 15:00:56.864112 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-w7hcx"] Oct 03 15:00:56 crc kubenswrapper[4774]: I1003 15:00:56.991865 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7755d164-f1c7-4f58-91d4-5ac4ab948090-credential-keys\") pod \"keystone-bootstrap-w7hcx\" (UID: \"7755d164-f1c7-4f58-91d4-5ac4ab948090\") " pod="openstack/keystone-bootstrap-w7hcx" Oct 03 15:00:56 crc kubenswrapper[4774]: I1003 15:00:56.991949 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7755d164-f1c7-4f58-91d4-5ac4ab948090-config-data\") pod \"keystone-bootstrap-w7hcx\" (UID: \"7755d164-f1c7-4f58-91d4-5ac4ab948090\") " pod="openstack/keystone-bootstrap-w7hcx" Oct 03 15:00:56 crc kubenswrapper[4774]: I1003 15:00:56.992000 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x2gc\" (UniqueName: \"kubernetes.io/projected/7755d164-f1c7-4f58-91d4-5ac4ab948090-kube-api-access-2x2gc\") pod \"keystone-bootstrap-w7hcx\" (UID: \"7755d164-f1c7-4f58-91d4-5ac4ab948090\") " pod="openstack/keystone-bootstrap-w7hcx" Oct 03 15:00:56 crc kubenswrapper[4774]: I1003 15:00:56.992023 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7755d164-f1c7-4f58-91d4-5ac4ab948090-fernet-keys\") pod \"keystone-bootstrap-w7hcx\" (UID: \"7755d164-f1c7-4f58-91d4-5ac4ab948090\") " pod="openstack/keystone-bootstrap-w7hcx" Oct 03 15:00:56 crc kubenswrapper[4774]: I1003 15:00:56.992155 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7755d164-f1c7-4f58-91d4-5ac4ab948090-scripts\") pod \"keystone-bootstrap-w7hcx\" (UID: \"7755d164-f1c7-4f58-91d4-5ac4ab948090\") " pod="openstack/keystone-bootstrap-w7hcx" Oct 03 15:00:56 crc kubenswrapper[4774]: I1003 15:00:56.992259 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7755d164-f1c7-4f58-91d4-5ac4ab948090-combined-ca-bundle\") pod \"keystone-bootstrap-w7hcx\" (UID: \"7755d164-f1c7-4f58-91d4-5ac4ab948090\") " pod="openstack/keystone-bootstrap-w7hcx" Oct 03 15:00:57 crc kubenswrapper[4774]: E1003 15:00:57.017208 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 03 15:00:57 crc kubenswrapper[4774]: E1003 15:00:57.017421 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-972jg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsU
ser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-sx6cl_openstack(be96775d-6115-4c8e-8539-230de2424b0e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 15:00:57 crc kubenswrapper[4774]: E1003 15:00:57.019592 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-sx6cl" podUID="be96775d-6115-4c8e-8539-230de2424b0e" Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.030464 4774 scope.go:117] "RemoveContainer" containerID="42120485ed44e93ecb0a6580d47ee5ba1d2a7ce990cea659352a3d49a42540ea" Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.094471 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7755d164-f1c7-4f58-91d4-5ac4ab948090-credential-keys\") pod \"keystone-bootstrap-w7hcx\" (UID: \"7755d164-f1c7-4f58-91d4-5ac4ab948090\") " pod="openstack/keystone-bootstrap-w7hcx" Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.094599 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7755d164-f1c7-4f58-91d4-5ac4ab948090-config-data\") pod \"keystone-bootstrap-w7hcx\" (UID: \"7755d164-f1c7-4f58-91d4-5ac4ab948090\") " pod="openstack/keystone-bootstrap-w7hcx" Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.094631 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2x2gc\" (UniqueName: \"kubernetes.io/projected/7755d164-f1c7-4f58-91d4-5ac4ab948090-kube-api-access-2x2gc\") pod \"keystone-bootstrap-w7hcx\" (UID: \"7755d164-f1c7-4f58-91d4-5ac4ab948090\") " pod="openstack/keystone-bootstrap-w7hcx" Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.094656 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7755d164-f1c7-4f58-91d4-5ac4ab948090-fernet-keys\") pod \"keystone-bootstrap-w7hcx\" (UID: \"7755d164-f1c7-4f58-91d4-5ac4ab948090\") " pod="openstack/keystone-bootstrap-w7hcx" Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.094727 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7755d164-f1c7-4f58-91d4-5ac4ab948090-scripts\") pod \"keystone-bootstrap-w7hcx\" (UID: \"7755d164-f1c7-4f58-91d4-5ac4ab948090\") " pod="openstack/keystone-bootstrap-w7hcx" Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.094811 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7755d164-f1c7-4f58-91d4-5ac4ab948090-combined-ca-bundle\") pod \"keystone-bootstrap-w7hcx\" (UID: \"7755d164-f1c7-4f58-91d4-5ac4ab948090\") " pod="openstack/keystone-bootstrap-w7hcx" Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.115158 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7755d164-f1c7-4f58-91d4-5ac4ab948090-scripts\") pod \"keystone-bootstrap-w7hcx\" (UID: \"7755d164-f1c7-4f58-91d4-5ac4ab948090\") " pod="openstack/keystone-bootstrap-w7hcx" Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.115430 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7755d164-f1c7-4f58-91d4-5ac4ab948090-credential-keys\") pod 
\"keystone-bootstrap-w7hcx\" (UID: \"7755d164-f1c7-4f58-91d4-5ac4ab948090\") " pod="openstack/keystone-bootstrap-w7hcx" Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.116431 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x2gc\" (UniqueName: \"kubernetes.io/projected/7755d164-f1c7-4f58-91d4-5ac4ab948090-kube-api-access-2x2gc\") pod \"keystone-bootstrap-w7hcx\" (UID: \"7755d164-f1c7-4f58-91d4-5ac4ab948090\") " pod="openstack/keystone-bootstrap-w7hcx" Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.116542 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7755d164-f1c7-4f58-91d4-5ac4ab948090-config-data\") pod \"keystone-bootstrap-w7hcx\" (UID: \"7755d164-f1c7-4f58-91d4-5ac4ab948090\") " pod="openstack/keystone-bootstrap-w7hcx" Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.116623 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7755d164-f1c7-4f58-91d4-5ac4ab948090-fernet-keys\") pod \"keystone-bootstrap-w7hcx\" (UID: \"7755d164-f1c7-4f58-91d4-5ac4ab948090\") " pod="openstack/keystone-bootstrap-w7hcx" Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.116744 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7755d164-f1c7-4f58-91d4-5ac4ab948090-combined-ca-bundle\") pod \"keystone-bootstrap-w7hcx\" (UID: \"7755d164-f1c7-4f58-91d4-5ac4ab948090\") " pod="openstack/keystone-bootstrap-w7hcx" Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.178785 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-w7hcx" Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.312844 4774 scope.go:117] "RemoveContainer" containerID="ceb81e14262147ea8dc69fb535a3c45b4247cf8388afc17bf35cb2cca414dba0" Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.317932 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-h54hg" Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.327905 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f303caa2-7916-4bdc-ba01-b5c55166dc53" path="/var/lib/kubelet/pods/f303caa2-7916-4bdc-ba01-b5c55166dc53/volumes" Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.373082 4774 scope.go:117] "RemoveContainer" containerID="6616c8e3faf8b095b06438175cdf1a0ac16119db9528bea6127e766683cad29d" Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.403866 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/403b4516-d6b7-408a-9013-6789d9d99abf-ovsdbserver-sb\") pod \"403b4516-d6b7-408a-9013-6789d9d99abf\" (UID: \"403b4516-d6b7-408a-9013-6789d9d99abf\") " Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.404003 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/403b4516-d6b7-408a-9013-6789d9d99abf-ovsdbserver-nb\") pod \"403b4516-d6b7-408a-9013-6789d9d99abf\" (UID: \"403b4516-d6b7-408a-9013-6789d9d99abf\") " Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.404126 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qt5b\" (UniqueName: \"kubernetes.io/projected/403b4516-d6b7-408a-9013-6789d9d99abf-kube-api-access-8qt5b\") pod \"403b4516-d6b7-408a-9013-6789d9d99abf\" (UID: \"403b4516-d6b7-408a-9013-6789d9d99abf\") " Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 
15:00:57.404294 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/403b4516-d6b7-408a-9013-6789d9d99abf-dns-svc\") pod \"403b4516-d6b7-408a-9013-6789d9d99abf\" (UID: \"403b4516-d6b7-408a-9013-6789d9d99abf\") " Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.404341 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/403b4516-d6b7-408a-9013-6789d9d99abf-config\") pod \"403b4516-d6b7-408a-9013-6789d9d99abf\" (UID: \"403b4516-d6b7-408a-9013-6789d9d99abf\") " Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.427395 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/403b4516-d6b7-408a-9013-6789d9d99abf-kube-api-access-8qt5b" (OuterVolumeSpecName: "kube-api-access-8qt5b") pod "403b4516-d6b7-408a-9013-6789d9d99abf" (UID: "403b4516-d6b7-408a-9013-6789d9d99abf"). InnerVolumeSpecName "kube-api-access-8qt5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.507761 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qt5b\" (UniqueName: \"kubernetes.io/projected/403b4516-d6b7-408a-9013-6789d9d99abf-kube-api-access-8qt5b\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.512919 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/403b4516-d6b7-408a-9013-6789d9d99abf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "403b4516-d6b7-408a-9013-6789d9d99abf" (UID: "403b4516-d6b7-408a-9013-6789d9d99abf"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.536793 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/403b4516-d6b7-408a-9013-6789d9d99abf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "403b4516-d6b7-408a-9013-6789d9d99abf" (UID: "403b4516-d6b7-408a-9013-6789d9d99abf"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.537283 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/403b4516-d6b7-408a-9013-6789d9d99abf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "403b4516-d6b7-408a-9013-6789d9d99abf" (UID: "403b4516-d6b7-408a-9013-6789d9d99abf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.558288 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/403b4516-d6b7-408a-9013-6789d9d99abf-config" (OuterVolumeSpecName: "config") pod "403b4516-d6b7-408a-9013-6789d9d99abf" (UID: "403b4516-d6b7-408a-9013-6789d9d99abf"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.582881 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5fbc-account-create-czgkc"] Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.588675 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f865bb968-k9r7v"] Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.611432 4774 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/403b4516-d6b7-408a-9013-6789d9d99abf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.611468 4774 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/403b4516-d6b7-408a-9013-6789d9d99abf-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.611480 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/403b4516-d6b7-408a-9013-6789d9d99abf-config\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.611491 4774 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/403b4516-d6b7-408a-9013-6789d9d99abf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:57 crc kubenswrapper[4774]: W1003 15:00:57.612103 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod066209ed_a6f1_4363_89d7_ad3a9b865341.slice/crio-1cb249d8fa9c924b0e39e0d7dc29a971c540e95253e1df2d0448314a26f0417e WatchSource:0}: Error finding container 1cb249d8fa9c924b0e39e0d7dc29a971c540e95253e1df2d0448314a26f0417e: Status 404 returned error can't find the container with id 1cb249d8fa9c924b0e39e0d7dc29a971c540e95253e1df2d0448314a26f0417e Oct 03 15:00:57 crc 
kubenswrapper[4774]: W1003 15:00:57.613244 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0b4826d_75e2_4023_8d53_3ddd0da5bc2e.slice/crio-28b877417284bed883f4213af10a5015a508351ef7c8fc21ed934b866240600a WatchSource:0}: Error finding container 28b877417284bed883f4213af10a5015a508351ef7c8fc21ed934b866240600a: Status 404 returned error can't find the container with id 28b877417284bed883f4213af10a5015a508351ef7c8fc21ed934b866240600a Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.726230 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-2589-account-create-zksfs"] Oct 03 15:00:57 crc kubenswrapper[4774]: W1003 15:00:57.735309 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod871f7d16_54b6_4aa9_8e99_00a888d41f70.slice/crio-e2a6a155f04b0405eecdb9c21f2fbeee3a4261ebaa974d7ee00a51cecd0f7e5d WatchSource:0}: Error finding container e2a6a155f04b0405eecdb9c21f2fbeee3a4261ebaa974d7ee00a51cecd0f7e5d: Status 404 returned error can't find the container with id e2a6a155f04b0405eecdb9c21f2fbeee3a4261ebaa974d7ee00a51cecd0f7e5d Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.736202 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-bc5bdf456-xt2x4"] Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.780988 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-587669876f-dmzh9" event={"ID":"b635609a-ab1c-4691-be86-da83abc3e663","Type":"ContainerStarted","Data":"d0e93c9919c9cc5e5ca30ec385dfc4897c2c951e0d054737ed3f23c89bc0759d"} Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.783264 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2589-account-create-zksfs" 
event={"ID":"fcf3e07f-94a6-4786-9ef9-a7cb9f1562b0","Type":"ContainerStarted","Data":"19b74bd48fdf50aeb3417bfacd04d67bed1f4671255202c903cbce5848fab1bd"} Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.784078 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f865bb968-k9r7v" event={"ID":"c0b4826d-75e2-4023-8d53-3ddd0da5bc2e","Type":"ContainerStarted","Data":"28b877417284bed883f4213af10a5015a508351ef7c8fc21ed934b866240600a"} Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.785488 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-f2ct5" event={"ID":"ce469a02-5678-42c1-84d7-21a29c1b3d18","Type":"ContainerStarted","Data":"57bcf11aa1d70c62aae6c53fcb319fd0412bb5ee497f9656611a4b4bf0ea9c52"} Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.787192 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-h54hg" event={"ID":"403b4516-d6b7-408a-9013-6789d9d99abf","Type":"ContainerDied","Data":"1647faa3af5c1796971e13c27367bc28b4a0153995ec8757a35cd5b4e37c99c7"} Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.787220 4774 scope.go:117] "RemoveContainer" containerID="4c4b5a3f1c9d59a7a4e9565b18e824792f91d86a248402a2a146c1b168045fc3" Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.787309 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-h54hg" Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.808521 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fbc-account-create-czgkc" event={"ID":"066209ed-a6f1-4363-89d7-ad3a9b865341","Type":"ContainerStarted","Data":"1cb249d8fa9c924b0e39e0d7dc29a971c540e95253e1df2d0448314a26f0417e"} Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.811547 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bc5bdf456-xt2x4" event={"ID":"871f7d16-54b6-4aa9-8e99-00a888d41f70","Type":"ContainerStarted","Data":"e2a6a155f04b0405eecdb9c21f2fbeee3a4261ebaa974d7ee00a51cecd0f7e5d"} Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.859942 4774 scope.go:117] "RemoveContainer" containerID="d15b8be91d87262ff0eac3104ca8c81ceeed645e51cdffd75373e46913752aa5" Oct 03 15:00:57 crc kubenswrapper[4774]: E1003 15:00:57.860068 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-sx6cl" podUID="be96775d-6115-4c8e-8539-230de2424b0e" Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.889759 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-h54hg"] Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.890458 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-h54hg"] Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.911393 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-w7hcx"] Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.921844 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 15:00:57 crc kubenswrapper[4774]: I1003 15:00:57.997973 4774 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 15:00:58 crc kubenswrapper[4774]: W1003 15:00:58.176985 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a4f000d_fdd6_46ec_b6b4_55a574ca801d.slice/crio-5c026e66286f4cab647ff08fc98b578a9428bfab2688d1b9902f51799f920575 WatchSource:0}: Error finding container 5c026e66286f4cab647ff08fc98b578a9428bfab2688d1b9902f51799f920575: Status 404 returned error can't find the container with id 5c026e66286f4cab647ff08fc98b578a9428bfab2688d1b9902f51799f920575 Oct 03 15:00:58 crc kubenswrapper[4774]: W1003 15:00:58.184596 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7755d164_f1c7_4f58_91d4_5ac4ab948090.slice/crio-23a9b1d970a7625eee219ff58409756f5478de09235140d077fde17e856cae8d WatchSource:0}: Error finding container 23a9b1d970a7625eee219ff58409756f5478de09235140d077fde17e856cae8d: Status 404 returned error can't find the container with id 23a9b1d970a7625eee219ff58409756f5478de09235140d077fde17e856cae8d Oct 03 15:00:58 crc kubenswrapper[4774]: W1003 15:00:58.193064 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd22e7a67_ce5f_4276_8c81_4ff98ad47524.slice/crio-4e22c7a548eb1c3c4a50856562b595d69f813a1f922151e747a321dd48b4ada3 WatchSource:0}: Error finding container 4e22c7a548eb1c3c4a50856562b595d69f813a1f922151e747a321dd48b4ada3: Status 404 returned error can't find the container with id 4e22c7a548eb1c3c4a50856562b595d69f813a1f922151e747a321dd48b4ada3 Oct 03 15:00:58 crc kubenswrapper[4774]: I1003 15:00:58.852767 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c54b8bfd5-ftr47" 
event={"ID":"fc47f9df-51ec-4aad-861b-c04b1321c5a3","Type":"ContainerStarted","Data":"247a9528a6cf7a41ed40e9c7c10c6a24c2c7ff0accb2b9421e74e4e566487827"} Oct 03 15:00:58 crc kubenswrapper[4774]: I1003 15:00:58.853066 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c54b8bfd5-ftr47" event={"ID":"fc47f9df-51ec-4aad-861b-c04b1321c5a3","Type":"ContainerStarted","Data":"2a9b54c31753f543f9667511be61815d3355dd815fbe91ce4cfab5a8a287e841"} Oct 03 15:00:58 crc kubenswrapper[4774]: I1003 15:00:58.853122 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7c54b8bfd5-ftr47" podUID="fc47f9df-51ec-4aad-861b-c04b1321c5a3" containerName="horizon-log" containerID="cri-o://2a9b54c31753f543f9667511be61815d3355dd815fbe91ce4cfab5a8a287e841" gracePeriod=30 Oct 03 15:00:58 crc kubenswrapper[4774]: I1003 15:00:58.853264 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7c54b8bfd5-ftr47" podUID="fc47f9df-51ec-4aad-861b-c04b1321c5a3" containerName="horizon" containerID="cri-o://247a9528a6cf7a41ed40e9c7c10c6a24c2c7ff0accb2b9421e74e4e566487827" gracePeriod=30 Oct 03 15:00:58 crc kubenswrapper[4774]: I1003 15:00:58.871225 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bc5bdf456-xt2x4" event={"ID":"871f7d16-54b6-4aa9-8e99-00a888d41f70","Type":"ContainerStarted","Data":"af677314a00ddf8adf8d016fd4a1bdce9452562c501a73facceeff916c357b0b"} Oct 03 15:00:58 crc kubenswrapper[4774]: I1003 15:00:58.871264 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bc5bdf456-xt2x4" event={"ID":"871f7d16-54b6-4aa9-8e99-00a888d41f70","Type":"ContainerStarted","Data":"4c3807337ee7f2b8e44c4122821db338ac04f20ba3e7962936f507b835bfe6b9"} Oct 03 15:00:58 crc kubenswrapper[4774]: I1003 15:00:58.873084 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"d22e7a67-ce5f-4276-8c81-4ff98ad47524","Type":"ContainerStarted","Data":"4e22c7a548eb1c3c4a50856562b595d69f813a1f922151e747a321dd48b4ada3"} Oct 03 15:00:58 crc kubenswrapper[4774]: I1003 15:00:58.878524 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7c54b8bfd5-ftr47" podStartSLOduration=3.761480117 podStartE2EDuration="28.878510124s" podCreationTimestamp="2025-10-03 15:00:30 +0000 UTC" firstStartedPulling="2025-10-03 15:00:31.955440755 +0000 UTC m=+1054.544644207" lastFinishedPulling="2025-10-03 15:00:57.072470752 +0000 UTC m=+1079.661674214" observedRunningTime="2025-10-03 15:00:58.875998252 +0000 UTC m=+1081.465201704" watchObservedRunningTime="2025-10-03 15:00:58.878510124 +0000 UTC m=+1081.467713576" Oct 03 15:00:58 crc kubenswrapper[4774]: I1003 15:00:58.881748 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9e41eee-4655-4cc2-b01d-37d1f947011b","Type":"ContainerStarted","Data":"243f3d3655fd1960cd360dac4b6b275a2de3079b6eb4a7aa1a8e22aa14983188"} Oct 03 15:00:58 crc kubenswrapper[4774]: I1003 15:00:58.891554 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-587669876f-dmzh9" event={"ID":"b635609a-ab1c-4691-be86-da83abc3e663","Type":"ContainerStarted","Data":"be81210244b0fa3606b3f1432526e204cc82f1277bf8001d1f74902aad26e524"} Oct 03 15:00:58 crc kubenswrapper[4774]: I1003 15:00:58.891717 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-587669876f-dmzh9" podUID="b635609a-ab1c-4691-be86-da83abc3e663" containerName="horizon-log" containerID="cri-o://d0e93c9919c9cc5e5ca30ec385dfc4897c2c951e0d054737ed3f23c89bc0759d" gracePeriod=30 Oct 03 15:00:58 crc kubenswrapper[4774]: I1003 15:00:58.891959 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-587669876f-dmzh9" podUID="b635609a-ab1c-4691-be86-da83abc3e663" containerName="horizon" 
containerID="cri-o://be81210244b0fa3606b3f1432526e204cc82f1277bf8001d1f74902aad26e524" gracePeriod=30 Oct 03 15:00:58 crc kubenswrapper[4774]: I1003 15:00:58.903948 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-765f99df69-hhjpp" event={"ID":"e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a","Type":"ContainerStarted","Data":"070d22da48804090a3d16634343e38782fa1cfae91b0c20b52e74ac8cb456ea9"} Oct 03 15:00:58 crc kubenswrapper[4774]: I1003 15:00:58.904003 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-765f99df69-hhjpp" event={"ID":"e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a","Type":"ContainerStarted","Data":"0d7e8b2630f6e23e079e498a4d690a2d7552d2b105fae7efa228ba39fd181246"} Oct 03 15:00:58 crc kubenswrapper[4774]: I1003 15:00:58.904154 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-765f99df69-hhjpp" podUID="e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a" containerName="horizon-log" containerID="cri-o://0d7e8b2630f6e23e079e498a4d690a2d7552d2b105fae7efa228ba39fd181246" gracePeriod=30 Oct 03 15:00:58 crc kubenswrapper[4774]: I1003 15:00:58.904455 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-765f99df69-hhjpp" podUID="e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a" containerName="horizon" containerID="cri-o://070d22da48804090a3d16634343e38782fa1cfae91b0c20b52e74ac8cb456ea9" gracePeriod=30 Oct 03 15:00:58 crc kubenswrapper[4774]: I1003 15:00:58.908214 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-bc5bdf456-xt2x4" podStartSLOduration=17.908156403 podStartE2EDuration="17.908156403s" podCreationTimestamp="2025-10-03 15:00:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:00:58.897154889 +0000 UTC m=+1081.486358351" watchObservedRunningTime="2025-10-03 15:00:58.908156403 +0000 UTC m=+1081.497359855" 
Oct 03 15:00:58 crc kubenswrapper[4774]: I1003 15:00:58.911805 4774 generic.go:334] "Generic (PLEG): container finished" podID="fcf3e07f-94a6-4786-9ef9-a7cb9f1562b0" containerID="ddf6f6d44c9b2467a9d47d4ffb5acdd138bfec1bd41c008d98dbe3fbd1096afb" exitCode=0 Oct 03 15:00:58 crc kubenswrapper[4774]: I1003 15:00:58.911888 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2589-account-create-zksfs" event={"ID":"fcf3e07f-94a6-4786-9ef9-a7cb9f1562b0","Type":"ContainerDied","Data":"ddf6f6d44c9b2467a9d47d4ffb5acdd138bfec1bd41c008d98dbe3fbd1096afb"} Oct 03 15:00:58 crc kubenswrapper[4774]: I1003 15:00:58.935859 4774 generic.go:334] "Generic (PLEG): container finished" podID="066209ed-a6f1-4363-89d7-ad3a9b865341" containerID="ab01957b32518039713a0e69498e4214d0b7b55eeb3d23b81bf87c15881fa6f6" exitCode=0 Oct 03 15:00:58 crc kubenswrapper[4774]: I1003 15:00:58.936023 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fbc-account-create-czgkc" event={"ID":"066209ed-a6f1-4363-89d7-ad3a9b865341","Type":"ContainerDied","Data":"ab01957b32518039713a0e69498e4214d0b7b55eeb3d23b81bf87c15881fa6f6"} Oct 03 15:00:58 crc kubenswrapper[4774]: I1003 15:00:58.942544 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-w7hcx" event={"ID":"7755d164-f1c7-4f58-91d4-5ac4ab948090","Type":"ContainerStarted","Data":"81e49b5c658d68212ef1d604598bf80afc030621e90b6eb6d556d4fc17280e35"} Oct 03 15:00:58 crc kubenswrapper[4774]: I1003 15:00:58.942603 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-w7hcx" event={"ID":"7755d164-f1c7-4f58-91d4-5ac4ab948090","Type":"ContainerStarted","Data":"23a9b1d970a7625eee219ff58409756f5478de09235140d077fde17e856cae8d"} Oct 03 15:00:58 crc kubenswrapper[4774]: I1003 15:00:58.948865 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-587669876f-dmzh9" podStartSLOduration=4.344877698 
podStartE2EDuration="29.948838466s" podCreationTimestamp="2025-10-03 15:00:29 +0000 UTC" firstStartedPulling="2025-10-03 15:00:31.382737037 +0000 UTC m=+1053.971940489" lastFinishedPulling="2025-10-03 15:00:56.986697805 +0000 UTC m=+1079.575901257" observedRunningTime="2025-10-03 15:00:58.920280605 +0000 UTC m=+1081.509484057" watchObservedRunningTime="2025-10-03 15:00:58.948838466 +0000 UTC m=+1081.538041938" Oct 03 15:00:58 crc kubenswrapper[4774]: I1003 15:00:58.966487 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f865bb968-k9r7v" event={"ID":"c0b4826d-75e2-4023-8d53-3ddd0da5bc2e","Type":"ContainerStarted","Data":"1990edae3c818166e8be5e6e5c7ef3a774db7f898c3805fb250065ea98e62a0b"} Oct 03 15:00:58 crc kubenswrapper[4774]: I1003 15:00:58.966540 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f865bb968-k9r7v" event={"ID":"c0b4826d-75e2-4023-8d53-3ddd0da5bc2e","Type":"ContainerStarted","Data":"06835b30cd8fb8baca56b78a5c4e7b520c393eebcac1a3aa2330a560602b2c5e"} Oct 03 15:00:58 crc kubenswrapper[4774]: I1003 15:00:58.972433 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5a4f000d-fdd6-46ec-b6b4-55a574ca801d","Type":"ContainerStarted","Data":"5c026e66286f4cab647ff08fc98b578a9428bfab2688d1b9902f51799f920575"} Oct 03 15:00:58 crc kubenswrapper[4774]: I1003 15:00:58.980936 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-765f99df69-hhjpp" podStartSLOduration=3.094085557 podStartE2EDuration="25.980907765s" podCreationTimestamp="2025-10-03 15:00:33 +0000 UTC" firstStartedPulling="2025-10-03 15:00:34.199574561 +0000 UTC m=+1056.788778013" lastFinishedPulling="2025-10-03 15:00:57.086396769 +0000 UTC m=+1079.675600221" observedRunningTime="2025-10-03 15:00:58.974992508 +0000 UTC m=+1081.564195960" watchObservedRunningTime="2025-10-03 15:00:58.980907765 +0000 UTC m=+1081.570111237" Oct 03 15:00:59 crc 
kubenswrapper[4774]: I1003 15:00:59.007286 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-f2ct5" podStartSLOduration=3.769730953 podStartE2EDuration="29.007264642s" podCreationTimestamp="2025-10-03 15:00:30 +0000 UTC" firstStartedPulling="2025-10-03 15:00:31.722661186 +0000 UTC m=+1054.311864648" lastFinishedPulling="2025-10-03 15:00:56.960194885 +0000 UTC m=+1079.549398337" observedRunningTime="2025-10-03 15:00:59.004812261 +0000 UTC m=+1081.594015723" watchObservedRunningTime="2025-10-03 15:00:59.007264642 +0000 UTC m=+1081.596468094" Oct 03 15:00:59 crc kubenswrapper[4774]: I1003 15:00:59.063645 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-f865bb968-k9r7v" podStartSLOduration=18.063624666 podStartE2EDuration="18.063624666s" podCreationTimestamp="2025-10-03 15:00:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:00:59.048169581 +0000 UTC m=+1081.637373033" watchObservedRunningTime="2025-10-03 15:00:59.063624666 +0000 UTC m=+1081.652828118" Oct 03 15:00:59 crc kubenswrapper[4774]: I1003 15:00:59.149794 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-w7hcx" podStartSLOduration=3.149767332 podStartE2EDuration="3.149767332s" podCreationTimestamp="2025-10-03 15:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:00:59.102921885 +0000 UTC m=+1081.692125347" watchObservedRunningTime="2025-10-03 15:00:59.149767332 +0000 UTC m=+1081.738970784" Oct 03 15:00:59 crc kubenswrapper[4774]: I1003 15:00:59.318928 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="403b4516-d6b7-408a-9013-6789d9d99abf" path="/var/lib/kubelet/pods/403b4516-d6b7-408a-9013-6789d9d99abf/volumes" Oct 03 
15:00:59 crc kubenswrapper[4774]: I1003 15:00:59.984279 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5a4f000d-fdd6-46ec-b6b4-55a574ca801d","Type":"ContainerStarted","Data":"6dadf5c7a6046e152b21be98bc4a11f540148be2f9f8dbd1628a8d83d1d43ea2"} Oct 03 15:00:59 crc kubenswrapper[4774]: I1003 15:00:59.984642 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5a4f000d-fdd6-46ec-b6b4-55a574ca801d","Type":"ContainerStarted","Data":"b463abc99cf516a57bf6debf4dde0935b961cbb0865fa8230aa54cdca3c79d65"} Oct 03 15:00:59 crc kubenswrapper[4774]: I1003 15:00:59.990703 4774 generic.go:334] "Generic (PLEG): container finished" podID="ce469a02-5678-42c1-84d7-21a29c1b3d18" containerID="57bcf11aa1d70c62aae6c53fcb319fd0412bb5ee497f9656611a4b4bf0ea9c52" exitCode=0 Oct 03 15:00:59 crc kubenswrapper[4774]: I1003 15:00:59.990778 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-f2ct5" event={"ID":"ce469a02-5678-42c1-84d7-21a29c1b3d18","Type":"ContainerDied","Data":"57bcf11aa1d70c62aae6c53fcb319fd0412bb5ee497f9656611a4b4bf0ea9c52"} Oct 03 15:00:59 crc kubenswrapper[4774]: I1003 15:00:59.996940 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d22e7a67-ce5f-4276-8c81-4ff98ad47524","Type":"ContainerStarted","Data":"7d261663daa84edecac8cf5c077af37a0c44d06245a72aace4d5d6876340f6c5"} Oct 03 15:00:59 crc kubenswrapper[4774]: I1003 15:00:59.996984 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d22e7a67-ce5f-4276-8c81-4ff98ad47524","Type":"ContainerStarted","Data":"6649f825244c5a3ab28f6b97034ed4582119b4d3d49dcfc4805e3f1cf51f54d5"} Oct 03 15:01:00 crc kubenswrapper[4774]: I1003 15:01:00.011169 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/glance-default-external-api-0" podStartSLOduration=17.01115334 podStartE2EDuration="17.01115334s" podCreationTimestamp="2025-10-03 15:00:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:01:00.010792311 +0000 UTC m=+1082.599995763" watchObservedRunningTime="2025-10-03 15:01:00.01115334 +0000 UTC m=+1082.600356792" Oct 03 15:01:00 crc kubenswrapper[4774]: I1003 15:01:00.064135 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=17.06411669 podStartE2EDuration="17.06411669s" podCreationTimestamp="2025-10-03 15:00:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:01:00.056057069 +0000 UTC m=+1082.645260521" watchObservedRunningTime="2025-10-03 15:01:00.06411669 +0000 UTC m=+1082.653320142" Oct 03 15:01:00 crc kubenswrapper[4774]: I1003 15:01:00.372758 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-587669876f-dmzh9" Oct 03 15:01:00 crc kubenswrapper[4774]: I1003 15:01:00.617881 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2589-account-create-zksfs" Oct 03 15:01:00 crc kubenswrapper[4774]: I1003 15:01:00.625292 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5fbc-account-create-czgkc" Oct 03 15:01:00 crc kubenswrapper[4774]: I1003 15:01:00.697875 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47ndl\" (UniqueName: \"kubernetes.io/projected/fcf3e07f-94a6-4786-9ef9-a7cb9f1562b0-kube-api-access-47ndl\") pod \"fcf3e07f-94a6-4786-9ef9-a7cb9f1562b0\" (UID: \"fcf3e07f-94a6-4786-9ef9-a7cb9f1562b0\") " Oct 03 15:01:00 crc kubenswrapper[4774]: I1003 15:01:00.698020 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vg4m2\" (UniqueName: \"kubernetes.io/projected/066209ed-a6f1-4363-89d7-ad3a9b865341-kube-api-access-vg4m2\") pod \"066209ed-a6f1-4363-89d7-ad3a9b865341\" (UID: \"066209ed-a6f1-4363-89d7-ad3a9b865341\") " Oct 03 15:01:00 crc kubenswrapper[4774]: I1003 15:01:00.703840 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcf3e07f-94a6-4786-9ef9-a7cb9f1562b0-kube-api-access-47ndl" (OuterVolumeSpecName: "kube-api-access-47ndl") pod "fcf3e07f-94a6-4786-9ef9-a7cb9f1562b0" (UID: "fcf3e07f-94a6-4786-9ef9-a7cb9f1562b0"). InnerVolumeSpecName "kube-api-access-47ndl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:01:00 crc kubenswrapper[4774]: I1003 15:01:00.724838 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/066209ed-a6f1-4363-89d7-ad3a9b865341-kube-api-access-vg4m2" (OuterVolumeSpecName: "kube-api-access-vg4m2") pod "066209ed-a6f1-4363-89d7-ad3a9b865341" (UID: "066209ed-a6f1-4363-89d7-ad3a9b865341"). InnerVolumeSpecName "kube-api-access-vg4m2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:01:00 crc kubenswrapper[4774]: I1003 15:01:00.800032 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vg4m2\" (UniqueName: \"kubernetes.io/projected/066209ed-a6f1-4363-89d7-ad3a9b865341-kube-api-access-vg4m2\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:00 crc kubenswrapper[4774]: I1003 15:01:00.800336 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47ndl\" (UniqueName: \"kubernetes.io/projected/fcf3e07f-94a6-4786-9ef9-a7cb9f1562b0-kube-api-access-47ndl\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:01 crc kubenswrapper[4774]: I1003 15:01:01.006383 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2589-account-create-zksfs" event={"ID":"fcf3e07f-94a6-4786-9ef9-a7cb9f1562b0","Type":"ContainerDied","Data":"19b74bd48fdf50aeb3417bfacd04d67bed1f4671255202c903cbce5848fab1bd"} Oct 03 15:01:01 crc kubenswrapper[4774]: I1003 15:01:01.006418 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19b74bd48fdf50aeb3417bfacd04d67bed1f4671255202c903cbce5848fab1bd" Oct 03 15:01:01 crc kubenswrapper[4774]: I1003 15:01:01.006388 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-2589-account-create-zksfs" Oct 03 15:01:01 crc kubenswrapper[4774]: I1003 15:01:01.009271 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fbc-account-create-czgkc" event={"ID":"066209ed-a6f1-4363-89d7-ad3a9b865341","Type":"ContainerDied","Data":"1cb249d8fa9c924b0e39e0d7dc29a971c540e95253e1df2d0448314a26f0417e"} Oct 03 15:01:01 crc kubenswrapper[4774]: I1003 15:01:01.009315 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cb249d8fa9c924b0e39e0d7dc29a971c540e95253e1df2d0448314a26f0417e" Oct 03 15:01:01 crc kubenswrapper[4774]: I1003 15:01:01.009627 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5fbc-account-create-czgkc" Oct 03 15:01:01 crc kubenswrapper[4774]: I1003 15:01:01.016415 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7c54b8bfd5-ftr47" Oct 03 15:01:01 crc kubenswrapper[4774]: I1003 15:01:01.277226 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-f2ct5" Oct 03 15:01:01 crc kubenswrapper[4774]: I1003 15:01:01.410593 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce469a02-5678-42c1-84d7-21a29c1b3d18-config-data\") pod \"ce469a02-5678-42c1-84d7-21a29c1b3d18\" (UID: \"ce469a02-5678-42c1-84d7-21a29c1b3d18\") " Oct 03 15:01:01 crc kubenswrapper[4774]: I1003 15:01:01.410664 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b569b\" (UniqueName: \"kubernetes.io/projected/ce469a02-5678-42c1-84d7-21a29c1b3d18-kube-api-access-b569b\") pod \"ce469a02-5678-42c1-84d7-21a29c1b3d18\" (UID: \"ce469a02-5678-42c1-84d7-21a29c1b3d18\") " Oct 03 15:01:01 crc kubenswrapper[4774]: I1003 15:01:01.410739 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce469a02-5678-42c1-84d7-21a29c1b3d18-logs\") pod \"ce469a02-5678-42c1-84d7-21a29c1b3d18\" (UID: \"ce469a02-5678-42c1-84d7-21a29c1b3d18\") " Oct 03 15:01:01 crc kubenswrapper[4774]: I1003 15:01:01.410790 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce469a02-5678-42c1-84d7-21a29c1b3d18-scripts\") pod \"ce469a02-5678-42c1-84d7-21a29c1b3d18\" (UID: \"ce469a02-5678-42c1-84d7-21a29c1b3d18\") " Oct 03 15:01:01 crc kubenswrapper[4774]: I1003 15:01:01.410822 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce469a02-5678-42c1-84d7-21a29c1b3d18-combined-ca-bundle\") pod \"ce469a02-5678-42c1-84d7-21a29c1b3d18\" (UID: \"ce469a02-5678-42c1-84d7-21a29c1b3d18\") " Oct 03 15:01:01 crc kubenswrapper[4774]: I1003 15:01:01.411336 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ce469a02-5678-42c1-84d7-21a29c1b3d18-logs" (OuterVolumeSpecName: "logs") pod "ce469a02-5678-42c1-84d7-21a29c1b3d18" (UID: "ce469a02-5678-42c1-84d7-21a29c1b3d18"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:01:01 crc kubenswrapper[4774]: I1003 15:01:01.411513 4774 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce469a02-5678-42c1-84d7-21a29c1b3d18-logs\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:01 crc kubenswrapper[4774]: I1003 15:01:01.416775 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce469a02-5678-42c1-84d7-21a29c1b3d18-kube-api-access-b569b" (OuterVolumeSpecName: "kube-api-access-b569b") pod "ce469a02-5678-42c1-84d7-21a29c1b3d18" (UID: "ce469a02-5678-42c1-84d7-21a29c1b3d18"). InnerVolumeSpecName "kube-api-access-b569b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:01:01 crc kubenswrapper[4774]: I1003 15:01:01.422751 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce469a02-5678-42c1-84d7-21a29c1b3d18-scripts" (OuterVolumeSpecName: "scripts") pod "ce469a02-5678-42c1-84d7-21a29c1b3d18" (UID: "ce469a02-5678-42c1-84d7-21a29c1b3d18"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:01 crc kubenswrapper[4774]: I1003 15:01:01.451434 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce469a02-5678-42c1-84d7-21a29c1b3d18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce469a02-5678-42c1-84d7-21a29c1b3d18" (UID: "ce469a02-5678-42c1-84d7-21a29c1b3d18"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:01 crc kubenswrapper[4774]: I1003 15:01:01.454291 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce469a02-5678-42c1-84d7-21a29c1b3d18-config-data" (OuterVolumeSpecName: "config-data") pod "ce469a02-5678-42c1-84d7-21a29c1b3d18" (UID: "ce469a02-5678-42c1-84d7-21a29c1b3d18"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:01 crc kubenswrapper[4774]: I1003 15:01:01.512665 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce469a02-5678-42c1-84d7-21a29c1b3d18-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:01 crc kubenswrapper[4774]: I1003 15:01:01.512703 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce469a02-5678-42c1-84d7-21a29c1b3d18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:01 crc kubenswrapper[4774]: I1003 15:01:01.512714 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce469a02-5678-42c1-84d7-21a29c1b3d18-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:01 crc kubenswrapper[4774]: I1003 15:01:01.512723 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b569b\" (UniqueName: \"kubernetes.io/projected/ce469a02-5678-42c1-84d7-21a29c1b3d18-kube-api-access-b569b\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:01 crc kubenswrapper[4774]: I1003 15:01:01.611974 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-bc5bdf456-xt2x4" Oct 03 15:01:01 crc kubenswrapper[4774]: I1003 15:01:01.612286 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-bc5bdf456-xt2x4" Oct 03 15:01:01 crc kubenswrapper[4774]: I1003 15:01:01.705818 4774 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/horizon-f865bb968-k9r7v" Oct 03 15:01:01 crc kubenswrapper[4774]: I1003 15:01:01.707306 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-f865bb968-k9r7v" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.061849 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-f2ct5" event={"ID":"ce469a02-5678-42c1-84d7-21a29c1b3d18","Type":"ContainerDied","Data":"7de213dd285b152b26d64cd6f97bcc2ea847e6c2e4f2a2a72cb7d0aebda99d89"} Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.061909 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7de213dd285b152b26d64cd6f97bcc2ea847e6c2e4f2a2a72cb7d0aebda99d89" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.062015 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-f2ct5" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.112496 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-bc86db558-frxdt"] Oct 03 15:01:02 crc kubenswrapper[4774]: E1003 15:01:02.112826 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="403b4516-d6b7-408a-9013-6789d9d99abf" containerName="dnsmasq-dns" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.112840 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="403b4516-d6b7-408a-9013-6789d9d99abf" containerName="dnsmasq-dns" Oct 03 15:01:02 crc kubenswrapper[4774]: E1003 15:01:02.112852 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="403b4516-d6b7-408a-9013-6789d9d99abf" containerName="init" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.112858 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="403b4516-d6b7-408a-9013-6789d9d99abf" containerName="init" Oct 03 15:01:02 crc kubenswrapper[4774]: E1003 15:01:02.112868 4774 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="066209ed-a6f1-4363-89d7-ad3a9b865341" containerName="mariadb-account-create" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.112876 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="066209ed-a6f1-4363-89d7-ad3a9b865341" containerName="mariadb-account-create" Oct 03 15:01:02 crc kubenswrapper[4774]: E1003 15:01:02.112893 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcf3e07f-94a6-4786-9ef9-a7cb9f1562b0" containerName="mariadb-account-create" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.112899 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcf3e07f-94a6-4786-9ef9-a7cb9f1562b0" containerName="mariadb-account-create" Oct 03 15:01:02 crc kubenswrapper[4774]: E1003 15:01:02.112924 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce469a02-5678-42c1-84d7-21a29c1b3d18" containerName="placement-db-sync" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.112930 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce469a02-5678-42c1-84d7-21a29c1b3d18" containerName="placement-db-sync" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.113086 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="066209ed-a6f1-4363-89d7-ad3a9b865341" containerName="mariadb-account-create" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.113101 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce469a02-5678-42c1-84d7-21a29c1b3d18" containerName="placement-db-sync" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.113113 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="403b4516-d6b7-408a-9013-6789d9d99abf" containerName="dnsmasq-dns" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.113126 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcf3e07f-94a6-4786-9ef9-a7cb9f1562b0" containerName="mariadb-account-create" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.114072 4774 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-bc86db558-frxdt" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.120796 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-tncn6" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.121294 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.121422 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.121576 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.121751 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.133490 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-bc86db558-frxdt"] Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.224940 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e952bb63-3c66-43a3-a8ef-34e636f1b400-scripts\") pod \"placement-bc86db558-frxdt\" (UID: \"e952bb63-3c66-43a3-a8ef-34e636f1b400\") " pod="openstack/placement-bc86db558-frxdt" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.225088 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e952bb63-3c66-43a3-a8ef-34e636f1b400-internal-tls-certs\") pod \"placement-bc86db558-frxdt\" (UID: \"e952bb63-3c66-43a3-a8ef-34e636f1b400\") " pod="openstack/placement-bc86db558-frxdt" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.225120 4774 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e952bb63-3c66-43a3-a8ef-34e636f1b400-combined-ca-bundle\") pod \"placement-bc86db558-frxdt\" (UID: \"e952bb63-3c66-43a3-a8ef-34e636f1b400\") " pod="openstack/placement-bc86db558-frxdt" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.225144 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqt6l\" (UniqueName: \"kubernetes.io/projected/e952bb63-3c66-43a3-a8ef-34e636f1b400-kube-api-access-vqt6l\") pod \"placement-bc86db558-frxdt\" (UID: \"e952bb63-3c66-43a3-a8ef-34e636f1b400\") " pod="openstack/placement-bc86db558-frxdt" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.225227 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e952bb63-3c66-43a3-a8ef-34e636f1b400-logs\") pod \"placement-bc86db558-frxdt\" (UID: \"e952bb63-3c66-43a3-a8ef-34e636f1b400\") " pod="openstack/placement-bc86db558-frxdt" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.225259 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e952bb63-3c66-43a3-a8ef-34e636f1b400-config-data\") pod \"placement-bc86db558-frxdt\" (UID: \"e952bb63-3c66-43a3-a8ef-34e636f1b400\") " pod="openstack/placement-bc86db558-frxdt" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.225279 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e952bb63-3c66-43a3-a8ef-34e636f1b400-public-tls-certs\") pod \"placement-bc86db558-frxdt\" (UID: \"e952bb63-3c66-43a3-a8ef-34e636f1b400\") " pod="openstack/placement-bc86db558-frxdt" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.326824 
4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e952bb63-3c66-43a3-a8ef-34e636f1b400-scripts\") pod \"placement-bc86db558-frxdt\" (UID: \"e952bb63-3c66-43a3-a8ef-34e636f1b400\") " pod="openstack/placement-bc86db558-frxdt" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.326943 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e952bb63-3c66-43a3-a8ef-34e636f1b400-internal-tls-certs\") pod \"placement-bc86db558-frxdt\" (UID: \"e952bb63-3c66-43a3-a8ef-34e636f1b400\") " pod="openstack/placement-bc86db558-frxdt" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.326981 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e952bb63-3c66-43a3-a8ef-34e636f1b400-combined-ca-bundle\") pod \"placement-bc86db558-frxdt\" (UID: \"e952bb63-3c66-43a3-a8ef-34e636f1b400\") " pod="openstack/placement-bc86db558-frxdt" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.327005 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqt6l\" (UniqueName: \"kubernetes.io/projected/e952bb63-3c66-43a3-a8ef-34e636f1b400-kube-api-access-vqt6l\") pod \"placement-bc86db558-frxdt\" (UID: \"e952bb63-3c66-43a3-a8ef-34e636f1b400\") " pod="openstack/placement-bc86db558-frxdt" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.327079 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e952bb63-3c66-43a3-a8ef-34e636f1b400-logs\") pod \"placement-bc86db558-frxdt\" (UID: \"e952bb63-3c66-43a3-a8ef-34e636f1b400\") " pod="openstack/placement-bc86db558-frxdt" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.327111 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/e952bb63-3c66-43a3-a8ef-34e636f1b400-config-data\") pod \"placement-bc86db558-frxdt\" (UID: \"e952bb63-3c66-43a3-a8ef-34e636f1b400\") " pod="openstack/placement-bc86db558-frxdt" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.327135 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e952bb63-3c66-43a3-a8ef-34e636f1b400-public-tls-certs\") pod \"placement-bc86db558-frxdt\" (UID: \"e952bb63-3c66-43a3-a8ef-34e636f1b400\") " pod="openstack/placement-bc86db558-frxdt" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.332237 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e952bb63-3c66-43a3-a8ef-34e636f1b400-logs\") pod \"placement-bc86db558-frxdt\" (UID: \"e952bb63-3c66-43a3-a8ef-34e636f1b400\") " pod="openstack/placement-bc86db558-frxdt" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.333005 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e952bb63-3c66-43a3-a8ef-34e636f1b400-public-tls-certs\") pod \"placement-bc86db558-frxdt\" (UID: \"e952bb63-3c66-43a3-a8ef-34e636f1b400\") " pod="openstack/placement-bc86db558-frxdt" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.333396 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e952bb63-3c66-43a3-a8ef-34e636f1b400-internal-tls-certs\") pod \"placement-bc86db558-frxdt\" (UID: \"e952bb63-3c66-43a3-a8ef-34e636f1b400\") " pod="openstack/placement-bc86db558-frxdt" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.334008 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e952bb63-3c66-43a3-a8ef-34e636f1b400-config-data\") pod \"placement-bc86db558-frxdt\" (UID: 
\"e952bb63-3c66-43a3-a8ef-34e636f1b400\") " pod="openstack/placement-bc86db558-frxdt" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.335576 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e952bb63-3c66-43a3-a8ef-34e636f1b400-combined-ca-bundle\") pod \"placement-bc86db558-frxdt\" (UID: \"e952bb63-3c66-43a3-a8ef-34e636f1b400\") " pod="openstack/placement-bc86db558-frxdt" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.336129 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e952bb63-3c66-43a3-a8ef-34e636f1b400-scripts\") pod \"placement-bc86db558-frxdt\" (UID: \"e952bb63-3c66-43a3-a8ef-34e636f1b400\") " pod="openstack/placement-bc86db558-frxdt" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.346157 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqt6l\" (UniqueName: \"kubernetes.io/projected/e952bb63-3c66-43a3-a8ef-34e636f1b400-kube-api-access-vqt6l\") pod \"placement-bc86db558-frxdt\" (UID: \"e952bb63-3c66-43a3-a8ef-34e636f1b400\") " pod="openstack/placement-bc86db558-frxdt" Oct 03 15:01:02 crc kubenswrapper[4774]: I1003 15:01:02.451968 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-bc86db558-frxdt" Oct 03 15:01:03 crc kubenswrapper[4774]: I1003 15:01:03.071508 4774 generic.go:334] "Generic (PLEG): container finished" podID="7755d164-f1c7-4f58-91d4-5ac4ab948090" containerID="81e49b5c658d68212ef1d604598bf80afc030621e90b6eb6d556d4fc17280e35" exitCode=0 Oct 03 15:01:03 crc kubenswrapper[4774]: I1003 15:01:03.071842 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-w7hcx" event={"ID":"7755d164-f1c7-4f58-91d4-5ac4ab948090","Type":"ContainerDied","Data":"81e49b5c658d68212ef1d604598bf80afc030621e90b6eb6d556d4fc17280e35"} Oct 03 15:01:03 crc kubenswrapper[4774]: I1003 15:01:03.649651 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-765f99df69-hhjpp" Oct 03 15:01:04 crc kubenswrapper[4774]: I1003 15:01:04.098275 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 15:01:04 crc kubenswrapper[4774]: I1003 15:01:04.098360 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 15:01:04 crc kubenswrapper[4774]: I1003 15:01:04.111047 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 15:01:04 crc kubenswrapper[4774]: I1003 15:01:04.111299 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 15:01:04 crc kubenswrapper[4774]: I1003 15:01:04.146200 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 15:01:04 crc kubenswrapper[4774]: I1003 15:01:04.155074 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 15:01:04 crc kubenswrapper[4774]: I1003 15:01:04.180277 4774 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 15:01:04 crc kubenswrapper[4774]: I1003 15:01:04.207127 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 15:01:05 crc kubenswrapper[4774]: I1003 15:01:05.090152 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 15:01:05 crc kubenswrapper[4774]: I1003 15:01:05.090199 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 03 15:01:05 crc kubenswrapper[4774]: I1003 15:01:05.090214 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 15:01:05 crc kubenswrapper[4774]: I1003 15:01:05.090389 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 03 15:01:06 crc kubenswrapper[4774]: I1003 15:01:06.088343 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-gp26k"] Oct 03 15:01:06 crc kubenswrapper[4774]: I1003 15:01:06.089594 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-gp26k" Oct 03 15:01:06 crc kubenswrapper[4774]: I1003 15:01:06.096650 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-5zx57" Oct 03 15:01:06 crc kubenswrapper[4774]: I1003 15:01:06.096944 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 03 15:01:06 crc kubenswrapper[4774]: I1003 15:01:06.107089 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-gp26k"] Oct 03 15:01:06 crc kubenswrapper[4774]: I1003 15:01:06.200433 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-66tpj"] Oct 03 15:01:06 crc kubenswrapper[4774]: I1003 15:01:06.201666 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-66tpj" Oct 03 15:01:06 crc kubenswrapper[4774]: I1003 15:01:06.206022 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-gmbc7" Oct 03 15:01:06 crc kubenswrapper[4774]: I1003 15:01:06.206315 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 03 15:01:06 crc kubenswrapper[4774]: I1003 15:01:06.206544 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 03 15:01:06 crc kubenswrapper[4774]: I1003 15:01:06.219047 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-66tpj"] Oct 03 15:01:06 crc kubenswrapper[4774]: I1003 15:01:06.220692 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29f3857a-d03c-48ab-93c4-0d75fc497c0e-db-sync-config-data\") pod \"barbican-db-sync-gp26k\" (UID: \"29f3857a-d03c-48ab-93c4-0d75fc497c0e\") " pod="openstack/barbican-db-sync-gp26k" Oct 03 15:01:06 crc kubenswrapper[4774]: 
I1003 15:01:06.220854 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8sr2\" (UniqueName: \"kubernetes.io/projected/29f3857a-d03c-48ab-93c4-0d75fc497c0e-kube-api-access-v8sr2\") pod \"barbican-db-sync-gp26k\" (UID: \"29f3857a-d03c-48ab-93c4-0d75fc497c0e\") " pod="openstack/barbican-db-sync-gp26k" Oct 03 15:01:06 crc kubenswrapper[4774]: I1003 15:01:06.220942 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29f3857a-d03c-48ab-93c4-0d75fc497c0e-combined-ca-bundle\") pod \"barbican-db-sync-gp26k\" (UID: \"29f3857a-d03c-48ab-93c4-0d75fc497c0e\") " pod="openstack/barbican-db-sync-gp26k" Oct 03 15:01:06 crc kubenswrapper[4774]: I1003 15:01:06.322724 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cdc24810-0778-4b37-8156-ecac9ae9e077-config\") pod \"neutron-db-sync-66tpj\" (UID: \"cdc24810-0778-4b37-8156-ecac9ae9e077\") " pod="openstack/neutron-db-sync-66tpj" Oct 03 15:01:06 crc kubenswrapper[4774]: I1003 15:01:06.322812 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd6c6\" (UniqueName: \"kubernetes.io/projected/cdc24810-0778-4b37-8156-ecac9ae9e077-kube-api-access-qd6c6\") pod \"neutron-db-sync-66tpj\" (UID: \"cdc24810-0778-4b37-8156-ecac9ae9e077\") " pod="openstack/neutron-db-sync-66tpj" Oct 03 15:01:06 crc kubenswrapper[4774]: I1003 15:01:06.322840 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29f3857a-d03c-48ab-93c4-0d75fc497c0e-db-sync-config-data\") pod \"barbican-db-sync-gp26k\" (UID: \"29f3857a-d03c-48ab-93c4-0d75fc497c0e\") " pod="openstack/barbican-db-sync-gp26k" Oct 03 15:01:06 crc kubenswrapper[4774]: I1003 15:01:06.322883 
4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8sr2\" (UniqueName: \"kubernetes.io/projected/29f3857a-d03c-48ab-93c4-0d75fc497c0e-kube-api-access-v8sr2\") pod \"barbican-db-sync-gp26k\" (UID: \"29f3857a-d03c-48ab-93c4-0d75fc497c0e\") " pod="openstack/barbican-db-sync-gp26k" Oct 03 15:01:06 crc kubenswrapper[4774]: I1003 15:01:06.322907 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdc24810-0778-4b37-8156-ecac9ae9e077-combined-ca-bundle\") pod \"neutron-db-sync-66tpj\" (UID: \"cdc24810-0778-4b37-8156-ecac9ae9e077\") " pod="openstack/neutron-db-sync-66tpj" Oct 03 15:01:06 crc kubenswrapper[4774]: I1003 15:01:06.322940 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29f3857a-d03c-48ab-93c4-0d75fc497c0e-combined-ca-bundle\") pod \"barbican-db-sync-gp26k\" (UID: \"29f3857a-d03c-48ab-93c4-0d75fc497c0e\") " pod="openstack/barbican-db-sync-gp26k" Oct 03 15:01:06 crc kubenswrapper[4774]: I1003 15:01:06.328556 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29f3857a-d03c-48ab-93c4-0d75fc497c0e-combined-ca-bundle\") pod \"barbican-db-sync-gp26k\" (UID: \"29f3857a-d03c-48ab-93c4-0d75fc497c0e\") " pod="openstack/barbican-db-sync-gp26k" Oct 03 15:01:06 crc kubenswrapper[4774]: I1003 15:01:06.328895 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29f3857a-d03c-48ab-93c4-0d75fc497c0e-db-sync-config-data\") pod \"barbican-db-sync-gp26k\" (UID: \"29f3857a-d03c-48ab-93c4-0d75fc497c0e\") " pod="openstack/barbican-db-sync-gp26k" Oct 03 15:01:06 crc kubenswrapper[4774]: I1003 15:01:06.342097 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-v8sr2\" (UniqueName: \"kubernetes.io/projected/29f3857a-d03c-48ab-93c4-0d75fc497c0e-kube-api-access-v8sr2\") pod \"barbican-db-sync-gp26k\" (UID: \"29f3857a-d03c-48ab-93c4-0d75fc497c0e\") " pod="openstack/barbican-db-sync-gp26k" Oct 03 15:01:06 crc kubenswrapper[4774]: I1003 15:01:06.423194 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-gp26k" Oct 03 15:01:06 crc kubenswrapper[4774]: I1003 15:01:06.424605 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd6c6\" (UniqueName: \"kubernetes.io/projected/cdc24810-0778-4b37-8156-ecac9ae9e077-kube-api-access-qd6c6\") pod \"neutron-db-sync-66tpj\" (UID: \"cdc24810-0778-4b37-8156-ecac9ae9e077\") " pod="openstack/neutron-db-sync-66tpj" Oct 03 15:01:06 crc kubenswrapper[4774]: I1003 15:01:06.424699 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdc24810-0778-4b37-8156-ecac9ae9e077-combined-ca-bundle\") pod \"neutron-db-sync-66tpj\" (UID: \"cdc24810-0778-4b37-8156-ecac9ae9e077\") " pod="openstack/neutron-db-sync-66tpj" Oct 03 15:01:06 crc kubenswrapper[4774]: I1003 15:01:06.424768 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cdc24810-0778-4b37-8156-ecac9ae9e077-config\") pod \"neutron-db-sync-66tpj\" (UID: \"cdc24810-0778-4b37-8156-ecac9ae9e077\") " pod="openstack/neutron-db-sync-66tpj" Oct 03 15:01:06 crc kubenswrapper[4774]: I1003 15:01:06.429482 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cdc24810-0778-4b37-8156-ecac9ae9e077-config\") pod \"neutron-db-sync-66tpj\" (UID: \"cdc24810-0778-4b37-8156-ecac9ae9e077\") " pod="openstack/neutron-db-sync-66tpj" Oct 03 15:01:06 crc kubenswrapper[4774]: I1003 15:01:06.432280 4774 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdc24810-0778-4b37-8156-ecac9ae9e077-combined-ca-bundle\") pod \"neutron-db-sync-66tpj\" (UID: \"cdc24810-0778-4b37-8156-ecac9ae9e077\") " pod="openstack/neutron-db-sync-66tpj" Oct 03 15:01:06 crc kubenswrapper[4774]: I1003 15:01:06.448620 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd6c6\" (UniqueName: \"kubernetes.io/projected/cdc24810-0778-4b37-8156-ecac9ae9e077-kube-api-access-qd6c6\") pod \"neutron-db-sync-66tpj\" (UID: \"cdc24810-0778-4b37-8156-ecac9ae9e077\") " pod="openstack/neutron-db-sync-66tpj" Oct 03 15:01:06 crc kubenswrapper[4774]: I1003 15:01:06.535140 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-66tpj" Oct 03 15:01:07 crc kubenswrapper[4774]: I1003 15:01:07.110200 4774 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 15:01:07 crc kubenswrapper[4774]: I1003 15:01:07.110228 4774 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 15:01:07 crc kubenswrapper[4774]: I1003 15:01:07.110377 4774 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 15:01:07 crc kubenswrapper[4774]: I1003 15:01:07.110396 4774 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 15:01:07 crc kubenswrapper[4774]: I1003 15:01:07.311240 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 15:01:07 crc kubenswrapper[4774]: I1003 15:01:07.783452 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 03 15:01:07 crc kubenswrapper[4774]: I1003 15:01:07.949113 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 15:01:08 crc kubenswrapper[4774]: I1003 
15:01:08.017915 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-w7hcx" Oct 03 15:01:08 crc kubenswrapper[4774]: I1003 15:01:08.070568 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7755d164-f1c7-4f58-91d4-5ac4ab948090-config-data\") pod \"7755d164-f1c7-4f58-91d4-5ac4ab948090\" (UID: \"7755d164-f1c7-4f58-91d4-5ac4ab948090\") " Oct 03 15:01:08 crc kubenswrapper[4774]: I1003 15:01:08.070641 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x2gc\" (UniqueName: \"kubernetes.io/projected/7755d164-f1c7-4f58-91d4-5ac4ab948090-kube-api-access-2x2gc\") pod \"7755d164-f1c7-4f58-91d4-5ac4ab948090\" (UID: \"7755d164-f1c7-4f58-91d4-5ac4ab948090\") " Oct 03 15:01:08 crc kubenswrapper[4774]: I1003 15:01:08.070701 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7755d164-f1c7-4f58-91d4-5ac4ab948090-scripts\") pod \"7755d164-f1c7-4f58-91d4-5ac4ab948090\" (UID: \"7755d164-f1c7-4f58-91d4-5ac4ab948090\") " Oct 03 15:01:08 crc kubenswrapper[4774]: I1003 15:01:08.070821 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7755d164-f1c7-4f58-91d4-5ac4ab948090-credential-keys\") pod \"7755d164-f1c7-4f58-91d4-5ac4ab948090\" (UID: \"7755d164-f1c7-4f58-91d4-5ac4ab948090\") " Oct 03 15:01:08 crc kubenswrapper[4774]: I1003 15:01:08.070882 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7755d164-f1c7-4f58-91d4-5ac4ab948090-combined-ca-bundle\") pod \"7755d164-f1c7-4f58-91d4-5ac4ab948090\" (UID: \"7755d164-f1c7-4f58-91d4-5ac4ab948090\") " Oct 03 15:01:08 crc kubenswrapper[4774]: I1003 15:01:08.070935 4774 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7755d164-f1c7-4f58-91d4-5ac4ab948090-fernet-keys\") pod \"7755d164-f1c7-4f58-91d4-5ac4ab948090\" (UID: \"7755d164-f1c7-4f58-91d4-5ac4ab948090\") " Oct 03 15:01:08 crc kubenswrapper[4774]: I1003 15:01:08.089662 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7755d164-f1c7-4f58-91d4-5ac4ab948090-scripts" (OuterVolumeSpecName: "scripts") pod "7755d164-f1c7-4f58-91d4-5ac4ab948090" (UID: "7755d164-f1c7-4f58-91d4-5ac4ab948090"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:08 crc kubenswrapper[4774]: I1003 15:01:08.097192 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7755d164-f1c7-4f58-91d4-5ac4ab948090-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7755d164-f1c7-4f58-91d4-5ac4ab948090" (UID: "7755d164-f1c7-4f58-91d4-5ac4ab948090"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:08 crc kubenswrapper[4774]: I1003 15:01:08.099662 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7755d164-f1c7-4f58-91d4-5ac4ab948090-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "7755d164-f1c7-4f58-91d4-5ac4ab948090" (UID: "7755d164-f1c7-4f58-91d4-5ac4ab948090"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:08 crc kubenswrapper[4774]: I1003 15:01:08.099688 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7755d164-f1c7-4f58-91d4-5ac4ab948090-kube-api-access-2x2gc" (OuterVolumeSpecName: "kube-api-access-2x2gc") pod "7755d164-f1c7-4f58-91d4-5ac4ab948090" (UID: "7755d164-f1c7-4f58-91d4-5ac4ab948090"). InnerVolumeSpecName "kube-api-access-2x2gc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:01:08 crc kubenswrapper[4774]: I1003 15:01:08.146573 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 03 15:01:08 crc kubenswrapper[4774]: I1003 15:01:08.146792 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7755d164-f1c7-4f58-91d4-5ac4ab948090-config-data" (OuterVolumeSpecName: "config-data") pod "7755d164-f1c7-4f58-91d4-5ac4ab948090" (UID: "7755d164-f1c7-4f58-91d4-5ac4ab948090"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:08 crc kubenswrapper[4774]: I1003 15:01:08.147453 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7755d164-f1c7-4f58-91d4-5ac4ab948090-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7755d164-f1c7-4f58-91d4-5ac4ab948090" (UID: "7755d164-f1c7-4f58-91d4-5ac4ab948090"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:08 crc kubenswrapper[4774]: I1003 15:01:08.180664 4774 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7755d164-f1c7-4f58-91d4-5ac4ab948090-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:08 crc kubenswrapper[4774]: I1003 15:01:08.180693 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7755d164-f1c7-4f58-91d4-5ac4ab948090-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:08 crc kubenswrapper[4774]: I1003 15:01:08.180703 4774 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7755d164-f1c7-4f58-91d4-5ac4ab948090-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:08 crc kubenswrapper[4774]: I1003 15:01:08.180713 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7755d164-f1c7-4f58-91d4-5ac4ab948090-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:08 crc kubenswrapper[4774]: I1003 15:01:08.180721 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2x2gc\" (UniqueName: \"kubernetes.io/projected/7755d164-f1c7-4f58-91d4-5ac4ab948090-kube-api-access-2x2gc\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:08 crc kubenswrapper[4774]: I1003 15:01:08.180731 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7755d164-f1c7-4f58-91d4-5ac4ab948090-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:08 crc kubenswrapper[4774]: I1003 15:01:08.180819 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-w7hcx" event={"ID":"7755d164-f1c7-4f58-91d4-5ac4ab948090","Type":"ContainerDied","Data":"23a9b1d970a7625eee219ff58409756f5478de09235140d077fde17e856cae8d"} Oct 03 15:01:08 crc kubenswrapper[4774]: I1003 
15:01:08.180846 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23a9b1d970a7625eee219ff58409756f5478de09235140d077fde17e856cae8d" Oct 03 15:01:08 crc kubenswrapper[4774]: I1003 15:01:08.180896 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-w7hcx" Oct 03 15:01:08 crc kubenswrapper[4774]: I1003 15:01:08.403679 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-bc86db558-frxdt"] Oct 03 15:01:08 crc kubenswrapper[4774]: W1003 15:01:08.408840 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode952bb63_3c66_43a3_a8ef_34e636f1b400.slice/crio-9455855e75d27bef7e083585c81e8fcba3ca3375075c144678ffd94de9881484 WatchSource:0}: Error finding container 9455855e75d27bef7e083585c81e8fcba3ca3375075c144678ffd94de9881484: Status 404 returned error can't find the container with id 9455855e75d27bef7e083585c81e8fcba3ca3375075c144678ffd94de9881484 Oct 03 15:01:08 crc kubenswrapper[4774]: I1003 15:01:08.530998 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-gp26k"] Oct 03 15:01:08 crc kubenswrapper[4774]: I1003 15:01:08.539798 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-66tpj"] Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.125782 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7444dd849d-z82k5"] Oct 03 15:01:09 crc kubenswrapper[4774]: E1003 15:01:09.126467 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7755d164-f1c7-4f58-91d4-5ac4ab948090" containerName="keystone-bootstrap" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.126484 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="7755d164-f1c7-4f58-91d4-5ac4ab948090" containerName="keystone-bootstrap" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.126687 
4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="7755d164-f1c7-4f58-91d4-5ac4ab948090" containerName="keystone-bootstrap" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.127207 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7444dd849d-z82k5" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.132756 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.132909 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.133005 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-5rssc" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.133050 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.133014 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.150021 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.151898 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7444dd849d-z82k5"] Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.199647 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9be0f44b-e4c6-475d-976b-d0b30b456b9c-internal-tls-certs\") pod \"keystone-7444dd849d-z82k5\" (UID: \"9be0f44b-e4c6-475d-976b-d0b30b456b9c\") " pod="openstack/keystone-7444dd849d-z82k5" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.199725 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9be0f44b-e4c6-475d-976b-d0b30b456b9c-config-data\") pod \"keystone-7444dd849d-z82k5\" (UID: \"9be0f44b-e4c6-475d-976b-d0b30b456b9c\") " pod="openstack/keystone-7444dd849d-z82k5" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.199778 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9be0f44b-e4c6-475d-976b-d0b30b456b9c-public-tls-certs\") pod \"keystone-7444dd849d-z82k5\" (UID: \"9be0f44b-e4c6-475d-976b-d0b30b456b9c\") " pod="openstack/keystone-7444dd849d-z82k5" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.199902 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9be0f44b-e4c6-475d-976b-d0b30b456b9c-fernet-keys\") pod \"keystone-7444dd849d-z82k5\" (UID: \"9be0f44b-e4c6-475d-976b-d0b30b456b9c\") " pod="openstack/keystone-7444dd849d-z82k5" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.199972 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9be0f44b-e4c6-475d-976b-d0b30b456b9c-scripts\") pod \"keystone-7444dd849d-z82k5\" (UID: \"9be0f44b-e4c6-475d-976b-d0b30b456b9c\") " pod="openstack/keystone-7444dd849d-z82k5" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.200062 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9be0f44b-e4c6-475d-976b-d0b30b456b9c-credential-keys\") pod \"keystone-7444dd849d-z82k5\" (UID: \"9be0f44b-e4c6-475d-976b-d0b30b456b9c\") " pod="openstack/keystone-7444dd849d-z82k5" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.200094 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xndc2\" (UniqueName: \"kubernetes.io/projected/9be0f44b-e4c6-475d-976b-d0b30b456b9c-kube-api-access-xndc2\") pod \"keystone-7444dd849d-z82k5\" (UID: \"9be0f44b-e4c6-475d-976b-d0b30b456b9c\") " pod="openstack/keystone-7444dd849d-z82k5" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.200162 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9be0f44b-e4c6-475d-976b-d0b30b456b9c-combined-ca-bundle\") pod \"keystone-7444dd849d-z82k5\" (UID: \"9be0f44b-e4c6-475d-976b-d0b30b456b9c\") " pod="openstack/keystone-7444dd849d-z82k5" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.224548 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gp26k" event={"ID":"29f3857a-d03c-48ab-93c4-0d75fc497c0e","Type":"ContainerStarted","Data":"16011eb89c43902ef7360101893272d6a5850d977c04fa1eda85af03db0c9c1c"} Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.237581 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-66tpj" event={"ID":"cdc24810-0778-4b37-8156-ecac9ae9e077","Type":"ContainerStarted","Data":"ce201ca2e39eab9f9957879b76b9964af29817ef6a45ec70c4b43086049a53a1"} Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.237635 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-66tpj" event={"ID":"cdc24810-0778-4b37-8156-ecac9ae9e077","Type":"ContainerStarted","Data":"6cfd5d5a159c06119132c546975544a01f9146a307480907fd8f623e2db3fb1c"} Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.247502 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-bc86db558-frxdt" event={"ID":"e952bb63-3c66-43a3-a8ef-34e636f1b400","Type":"ContainerStarted","Data":"1ec5b501203daf5a8823dfad024e5610c1456a0bc6411e71043755848096f4d3"} Oct 03 15:01:09 crc 
kubenswrapper[4774]: I1003 15:01:09.247612 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-bc86db558-frxdt" event={"ID":"e952bb63-3c66-43a3-a8ef-34e636f1b400","Type":"ContainerStarted","Data":"15685602762b7e6999b169bedc01bc118e73fbe17ed2843fb3da2679dcbcc8b0"} Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.247626 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-bc86db558-frxdt" event={"ID":"e952bb63-3c66-43a3-a8ef-34e636f1b400","Type":"ContainerStarted","Data":"9455855e75d27bef7e083585c81e8fcba3ca3375075c144678ffd94de9881484"} Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.248450 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-bc86db558-frxdt" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.248485 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-bc86db558-frxdt" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.257161 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9e41eee-4655-4cc2-b01d-37d1f947011b","Type":"ContainerStarted","Data":"ed4e3b3350eaeb9055c50f98a8537414d0624b18969aab216e1ab174f748523a"} Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.274720 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-66tpj" podStartSLOduration=3.2747006450000002 podStartE2EDuration="3.274700645s" podCreationTimestamp="2025-10-03 15:01:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:01:09.265473835 +0000 UTC m=+1091.854677297" watchObservedRunningTime="2025-10-03 15:01:09.274700645 +0000 UTC m=+1091.863904097" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.289122 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-bc86db558-frxdt" 
podStartSLOduration=7.289103524 podStartE2EDuration="7.289103524s" podCreationTimestamp="2025-10-03 15:01:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:01:09.281388221 +0000 UTC m=+1091.870591683" watchObservedRunningTime="2025-10-03 15:01:09.289103524 +0000 UTC m=+1091.878306976" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.301306 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9be0f44b-e4c6-475d-976b-d0b30b456b9c-fernet-keys\") pod \"keystone-7444dd849d-z82k5\" (UID: \"9be0f44b-e4c6-475d-976b-d0b30b456b9c\") " pod="openstack/keystone-7444dd849d-z82k5" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.301349 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9be0f44b-e4c6-475d-976b-d0b30b456b9c-scripts\") pod \"keystone-7444dd849d-z82k5\" (UID: \"9be0f44b-e4c6-475d-976b-d0b30b456b9c\") " pod="openstack/keystone-7444dd849d-z82k5" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.301391 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9be0f44b-e4c6-475d-976b-d0b30b456b9c-credential-keys\") pod \"keystone-7444dd849d-z82k5\" (UID: \"9be0f44b-e4c6-475d-976b-d0b30b456b9c\") " pod="openstack/keystone-7444dd849d-z82k5" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.301433 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xndc2\" (UniqueName: \"kubernetes.io/projected/9be0f44b-e4c6-475d-976b-d0b30b456b9c-kube-api-access-xndc2\") pod \"keystone-7444dd849d-z82k5\" (UID: \"9be0f44b-e4c6-475d-976b-d0b30b456b9c\") " pod="openstack/keystone-7444dd849d-z82k5" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.301474 4774 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9be0f44b-e4c6-475d-976b-d0b30b456b9c-combined-ca-bundle\") pod \"keystone-7444dd849d-z82k5\" (UID: \"9be0f44b-e4c6-475d-976b-d0b30b456b9c\") " pod="openstack/keystone-7444dd849d-z82k5" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.301518 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9be0f44b-e4c6-475d-976b-d0b30b456b9c-internal-tls-certs\") pod \"keystone-7444dd849d-z82k5\" (UID: \"9be0f44b-e4c6-475d-976b-d0b30b456b9c\") " pod="openstack/keystone-7444dd849d-z82k5" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.301589 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9be0f44b-e4c6-475d-976b-d0b30b456b9c-config-data\") pod \"keystone-7444dd849d-z82k5\" (UID: \"9be0f44b-e4c6-475d-976b-d0b30b456b9c\") " pod="openstack/keystone-7444dd849d-z82k5" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.301619 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9be0f44b-e4c6-475d-976b-d0b30b456b9c-public-tls-certs\") pod \"keystone-7444dd849d-z82k5\" (UID: \"9be0f44b-e4c6-475d-976b-d0b30b456b9c\") " pod="openstack/keystone-7444dd849d-z82k5" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.307785 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9be0f44b-e4c6-475d-976b-d0b30b456b9c-public-tls-certs\") pod \"keystone-7444dd849d-z82k5\" (UID: \"9be0f44b-e4c6-475d-976b-d0b30b456b9c\") " pod="openstack/keystone-7444dd849d-z82k5" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.308328 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/9be0f44b-e4c6-475d-976b-d0b30b456b9c-config-data\") pod \"keystone-7444dd849d-z82k5\" (UID: \"9be0f44b-e4c6-475d-976b-d0b30b456b9c\") " pod="openstack/keystone-7444dd849d-z82k5" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.308500 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9be0f44b-e4c6-475d-976b-d0b30b456b9c-credential-keys\") pod \"keystone-7444dd849d-z82k5\" (UID: \"9be0f44b-e4c6-475d-976b-d0b30b456b9c\") " pod="openstack/keystone-7444dd849d-z82k5" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.310220 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9be0f44b-e4c6-475d-976b-d0b30b456b9c-internal-tls-certs\") pod \"keystone-7444dd849d-z82k5\" (UID: \"9be0f44b-e4c6-475d-976b-d0b30b456b9c\") " pod="openstack/keystone-7444dd849d-z82k5" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.310253 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9be0f44b-e4c6-475d-976b-d0b30b456b9c-scripts\") pod \"keystone-7444dd849d-z82k5\" (UID: \"9be0f44b-e4c6-475d-976b-d0b30b456b9c\") " pod="openstack/keystone-7444dd849d-z82k5" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.311529 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9be0f44b-e4c6-475d-976b-d0b30b456b9c-fernet-keys\") pod \"keystone-7444dd849d-z82k5\" (UID: \"9be0f44b-e4c6-475d-976b-d0b30b456b9c\") " pod="openstack/keystone-7444dd849d-z82k5" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.343497 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9be0f44b-e4c6-475d-976b-d0b30b456b9c-combined-ca-bundle\") pod \"keystone-7444dd849d-z82k5\" (UID: 
\"9be0f44b-e4c6-475d-976b-d0b30b456b9c\") " pod="openstack/keystone-7444dd849d-z82k5" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.348516 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xndc2\" (UniqueName: \"kubernetes.io/projected/9be0f44b-e4c6-475d-976b-d0b30b456b9c-kube-api-access-xndc2\") pod \"keystone-7444dd849d-z82k5\" (UID: \"9be0f44b-e4c6-475d-976b-d0b30b456b9c\") " pod="openstack/keystone-7444dd849d-z82k5" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.450075 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7444dd849d-z82k5" Oct 03 15:01:09 crc kubenswrapper[4774]: I1003 15:01:09.868913 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7444dd849d-z82k5"] Oct 03 15:01:10 crc kubenswrapper[4774]: I1003 15:01:10.266879 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7444dd849d-z82k5" event={"ID":"9be0f44b-e4c6-475d-976b-d0b30b456b9c","Type":"ContainerStarted","Data":"555e91b1fda4364584fd82338f8118eb0adddfbbcc29b1af1376c6aa4eb9e7dc"} Oct 03 15:01:11 crc kubenswrapper[4774]: I1003 15:01:11.275736 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7444dd849d-z82k5" event={"ID":"9be0f44b-e4c6-475d-976b-d0b30b456b9c","Type":"ContainerStarted","Data":"3db74268f9fb7fdcafe5a700a6da099aa58e3dae0b4d6d80cc80c77be9c120bf"} Oct 03 15:01:11 crc kubenswrapper[4774]: I1003 15:01:11.615634 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-bc5bdf456-xt2x4" podUID="871f7d16-54b6-4aa9-8e99-00a888d41f70" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Oct 03 15:01:11 crc kubenswrapper[4774]: I1003 15:01:11.707785 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-f865bb968-k9r7v" 
podUID="c0b4826d-75e2-4023-8d53-3ddd0da5bc2e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Oct 03 15:01:12 crc kubenswrapper[4774]: I1003 15:01:12.283677 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7444dd849d-z82k5" Oct 03 15:01:12 crc kubenswrapper[4774]: I1003 15:01:12.304981 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7444dd849d-z82k5" podStartSLOduration=3.304963315 podStartE2EDuration="3.304963315s" podCreationTimestamp="2025-10-03 15:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:01:12.298216287 +0000 UTC m=+1094.887419739" watchObservedRunningTime="2025-10-03 15:01:12.304963315 +0000 UTC m=+1094.894166767" Oct 03 15:01:20 crc kubenswrapper[4774]: E1003 15:01:20.399098 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="d9e41eee-4655-4cc2-b01d-37d1f947011b" Oct 03 15:01:21 crc kubenswrapper[4774]: I1003 15:01:21.369524 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gp26k" event={"ID":"29f3857a-d03c-48ab-93c4-0d75fc497c0e","Type":"ContainerStarted","Data":"3721a4b95ab469a9fef4cef5499af290fd97dc2313b7266d4514c7d233652a61"} Oct 03 15:01:21 crc kubenswrapper[4774]: I1003 15:01:21.372435 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-sx6cl" event={"ID":"be96775d-6115-4c8e-8539-230de2424b0e","Type":"ContainerStarted","Data":"3e491844db25d065fe046339b9812d693015a5f1b6064363a997ecb709ecf30d"} Oct 03 15:01:21 crc kubenswrapper[4774]: I1003 
15:01:21.374495 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9e41eee-4655-4cc2-b01d-37d1f947011b","Type":"ContainerStarted","Data":"98b9a01eac0717c48a04feff10917773885c7865c780bf9362a382f4effe7ca8"} Oct 03 15:01:21 crc kubenswrapper[4774]: I1003 15:01:21.374627 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d9e41eee-4655-4cc2-b01d-37d1f947011b" containerName="ceilometer-notification-agent" containerID="cri-o://243f3d3655fd1960cd360dac4b6b275a2de3079b6eb4a7aa1a8e22aa14983188" gracePeriod=30 Oct 03 15:01:21 crc kubenswrapper[4774]: I1003 15:01:21.374666 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 15:01:21 crc kubenswrapper[4774]: I1003 15:01:21.374722 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d9e41eee-4655-4cc2-b01d-37d1f947011b" containerName="sg-core" containerID="cri-o://ed4e3b3350eaeb9055c50f98a8537414d0624b18969aab216e1ab174f748523a" gracePeriod=30 Oct 03 15:01:21 crc kubenswrapper[4774]: I1003 15:01:21.374715 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d9e41eee-4655-4cc2-b01d-37d1f947011b" containerName="proxy-httpd" containerID="cri-o://98b9a01eac0717c48a04feff10917773885c7865c780bf9362a382f4effe7ca8" gracePeriod=30 Oct 03 15:01:21 crc kubenswrapper[4774]: I1003 15:01:21.393714 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-gp26k" podStartSLOduration=3.924561149 podStartE2EDuration="15.393696083s" podCreationTimestamp="2025-10-03 15:01:06 +0000 UTC" firstStartedPulling="2025-10-03 15:01:08.547413916 +0000 UTC m=+1091.136617358" lastFinishedPulling="2025-10-03 15:01:20.01654884 +0000 UTC m=+1102.605752292" observedRunningTime="2025-10-03 15:01:21.386019562 +0000 UTC m=+1103.975223014" 
watchObservedRunningTime="2025-10-03 15:01:21.393696083 +0000 UTC m=+1103.982899535" Oct 03 15:01:21 crc kubenswrapper[4774]: I1003 15:01:21.427765 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-sx6cl" podStartSLOduration=2.350420325 podStartE2EDuration="50.427747537s" podCreationTimestamp="2025-10-03 15:00:31 +0000 UTC" firstStartedPulling="2025-10-03 15:00:32.073535627 +0000 UTC m=+1054.662739079" lastFinishedPulling="2025-10-03 15:01:20.150862839 +0000 UTC m=+1102.740066291" observedRunningTime="2025-10-03 15:01:21.420709922 +0000 UTC m=+1104.009913374" watchObservedRunningTime="2025-10-03 15:01:21.427747537 +0000 UTC m=+1104.016950989" Oct 03 15:01:22 crc kubenswrapper[4774]: I1003 15:01:22.388908 4774 generic.go:334] "Generic (PLEG): container finished" podID="d9e41eee-4655-4cc2-b01d-37d1f947011b" containerID="98b9a01eac0717c48a04feff10917773885c7865c780bf9362a382f4effe7ca8" exitCode=0 Oct 03 15:01:22 crc kubenswrapper[4774]: I1003 15:01:22.389633 4774 generic.go:334] "Generic (PLEG): container finished" podID="d9e41eee-4655-4cc2-b01d-37d1f947011b" containerID="ed4e3b3350eaeb9055c50f98a8537414d0624b18969aab216e1ab174f748523a" exitCode=2 Oct 03 15:01:22 crc kubenswrapper[4774]: I1003 15:01:22.389648 4774 generic.go:334] "Generic (PLEG): container finished" podID="d9e41eee-4655-4cc2-b01d-37d1f947011b" containerID="243f3d3655fd1960cd360dac4b6b275a2de3079b6eb4a7aa1a8e22aa14983188" exitCode=0 Oct 03 15:01:22 crc kubenswrapper[4774]: I1003 15:01:22.389103 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9e41eee-4655-4cc2-b01d-37d1f947011b","Type":"ContainerDied","Data":"98b9a01eac0717c48a04feff10917773885c7865c780bf9362a382f4effe7ca8"} Oct 03 15:01:22 crc kubenswrapper[4774]: I1003 15:01:22.389909 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d9e41eee-4655-4cc2-b01d-37d1f947011b","Type":"ContainerDied","Data":"ed4e3b3350eaeb9055c50f98a8537414d0624b18969aab216e1ab174f748523a"} Oct 03 15:01:22 crc kubenswrapper[4774]: I1003 15:01:22.389921 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9e41eee-4655-4cc2-b01d-37d1f947011b","Type":"ContainerDied","Data":"243f3d3655fd1960cd360dac4b6b275a2de3079b6eb4a7aa1a8e22aa14983188"} Oct 03 15:01:22 crc kubenswrapper[4774]: I1003 15:01:22.389931 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9e41eee-4655-4cc2-b01d-37d1f947011b","Type":"ContainerDied","Data":"74d32d7dea157a878329b28698540bb86ee7abbfda097b29ed0ec979532f70b3"} Oct 03 15:01:22 crc kubenswrapper[4774]: I1003 15:01:22.389942 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74d32d7dea157a878329b28698540bb86ee7abbfda097b29ed0ec979532f70b3" Oct 03 15:01:22 crc kubenswrapper[4774]: I1003 15:01:22.470141 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 15:01:22 crc kubenswrapper[4774]: I1003 15:01:22.534111 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9e41eee-4655-4cc2-b01d-37d1f947011b-scripts\") pod \"d9e41eee-4655-4cc2-b01d-37d1f947011b\" (UID: \"d9e41eee-4655-4cc2-b01d-37d1f947011b\") " Oct 03 15:01:22 crc kubenswrapper[4774]: I1003 15:01:22.534181 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9e41eee-4655-4cc2-b01d-37d1f947011b-run-httpd\") pod \"d9e41eee-4655-4cc2-b01d-37d1f947011b\" (UID: \"d9e41eee-4655-4cc2-b01d-37d1f947011b\") " Oct 03 15:01:22 crc kubenswrapper[4774]: I1003 15:01:22.534237 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9e41eee-4655-4cc2-b01d-37d1f947011b-combined-ca-bundle\") pod \"d9e41eee-4655-4cc2-b01d-37d1f947011b\" (UID: \"d9e41eee-4655-4cc2-b01d-37d1f947011b\") " Oct 03 15:01:22 crc kubenswrapper[4774]: I1003 15:01:22.534292 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9e41eee-4655-4cc2-b01d-37d1f947011b-config-data\") pod \"d9e41eee-4655-4cc2-b01d-37d1f947011b\" (UID: \"d9e41eee-4655-4cc2-b01d-37d1f947011b\") " Oct 03 15:01:22 crc kubenswrapper[4774]: I1003 15:01:22.534363 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9e41eee-4655-4cc2-b01d-37d1f947011b-sg-core-conf-yaml\") pod \"d9e41eee-4655-4cc2-b01d-37d1f947011b\" (UID: \"d9e41eee-4655-4cc2-b01d-37d1f947011b\") " Oct 03 15:01:22 crc kubenswrapper[4774]: I1003 15:01:22.534509 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/d9e41eee-4655-4cc2-b01d-37d1f947011b-log-httpd\") pod \"d9e41eee-4655-4cc2-b01d-37d1f947011b\" (UID: \"d9e41eee-4655-4cc2-b01d-37d1f947011b\") " Oct 03 15:01:22 crc kubenswrapper[4774]: I1003 15:01:22.534644 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmndq\" (UniqueName: \"kubernetes.io/projected/d9e41eee-4655-4cc2-b01d-37d1f947011b-kube-api-access-fmndq\") pod \"d9e41eee-4655-4cc2-b01d-37d1f947011b\" (UID: \"d9e41eee-4655-4cc2-b01d-37d1f947011b\") " Oct 03 15:01:22 crc kubenswrapper[4774]: I1003 15:01:22.534648 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9e41eee-4655-4cc2-b01d-37d1f947011b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d9e41eee-4655-4cc2-b01d-37d1f947011b" (UID: "d9e41eee-4655-4cc2-b01d-37d1f947011b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:01:22 crc kubenswrapper[4774]: I1003 15:01:22.535061 4774 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9e41eee-4655-4cc2-b01d-37d1f947011b-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:22 crc kubenswrapper[4774]: I1003 15:01:22.535483 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9e41eee-4655-4cc2-b01d-37d1f947011b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d9e41eee-4655-4cc2-b01d-37d1f947011b" (UID: "d9e41eee-4655-4cc2-b01d-37d1f947011b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:01:22 crc kubenswrapper[4774]: I1003 15:01:22.540330 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9e41eee-4655-4cc2-b01d-37d1f947011b-scripts" (OuterVolumeSpecName: "scripts") pod "d9e41eee-4655-4cc2-b01d-37d1f947011b" (UID: "d9e41eee-4655-4cc2-b01d-37d1f947011b"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:22 crc kubenswrapper[4774]: I1003 15:01:22.540522 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9e41eee-4655-4cc2-b01d-37d1f947011b-kube-api-access-fmndq" (OuterVolumeSpecName: "kube-api-access-fmndq") pod "d9e41eee-4655-4cc2-b01d-37d1f947011b" (UID: "d9e41eee-4655-4cc2-b01d-37d1f947011b"). InnerVolumeSpecName "kube-api-access-fmndq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:01:22 crc kubenswrapper[4774]: I1003 15:01:22.575514 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9e41eee-4655-4cc2-b01d-37d1f947011b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d9e41eee-4655-4cc2-b01d-37d1f947011b" (UID: "d9e41eee-4655-4cc2-b01d-37d1f947011b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:22 crc kubenswrapper[4774]: I1003 15:01:22.591597 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9e41eee-4655-4cc2-b01d-37d1f947011b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9e41eee-4655-4cc2-b01d-37d1f947011b" (UID: "d9e41eee-4655-4cc2-b01d-37d1f947011b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:22 crc kubenswrapper[4774]: I1003 15:01:22.628458 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9e41eee-4655-4cc2-b01d-37d1f947011b-config-data" (OuterVolumeSpecName: "config-data") pod "d9e41eee-4655-4cc2-b01d-37d1f947011b" (UID: "d9e41eee-4655-4cc2-b01d-37d1f947011b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:22 crc kubenswrapper[4774]: I1003 15:01:22.636765 4774 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9e41eee-4655-4cc2-b01d-37d1f947011b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:22 crc kubenswrapper[4774]: I1003 15:01:22.636805 4774 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9e41eee-4655-4cc2-b01d-37d1f947011b-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:22 crc kubenswrapper[4774]: I1003 15:01:22.636819 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmndq\" (UniqueName: \"kubernetes.io/projected/d9e41eee-4655-4cc2-b01d-37d1f947011b-kube-api-access-fmndq\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:22 crc kubenswrapper[4774]: I1003 15:01:22.636834 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9e41eee-4655-4cc2-b01d-37d1f947011b-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:22 crc kubenswrapper[4774]: I1003 15:01:22.636847 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9e41eee-4655-4cc2-b01d-37d1f947011b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:22 crc kubenswrapper[4774]: I1003 15:01:22.636858 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9e41eee-4655-4cc2-b01d-37d1f947011b-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.398634 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.449237 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.457762 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.485156 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:01:23 crc kubenswrapper[4774]: E1003 15:01:23.486627 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9e41eee-4655-4cc2-b01d-37d1f947011b" containerName="ceilometer-notification-agent" Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.486795 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9e41eee-4655-4cc2-b01d-37d1f947011b" containerName="ceilometer-notification-agent" Oct 03 15:01:23 crc kubenswrapper[4774]: E1003 15:01:23.486924 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9e41eee-4655-4cc2-b01d-37d1f947011b" containerName="sg-core" Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.487040 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9e41eee-4655-4cc2-b01d-37d1f947011b" containerName="sg-core" Oct 03 15:01:23 crc kubenswrapper[4774]: E1003 15:01:23.487155 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9e41eee-4655-4cc2-b01d-37d1f947011b" containerName="proxy-httpd" Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.487269 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9e41eee-4655-4cc2-b01d-37d1f947011b" containerName="proxy-httpd" Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.487713 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9e41eee-4655-4cc2-b01d-37d1f947011b" containerName="proxy-httpd" Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.487877 4774 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="d9e41eee-4655-4cc2-b01d-37d1f947011b" containerName="sg-core" Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.488021 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9e41eee-4655-4cc2-b01d-37d1f947011b" containerName="ceilometer-notification-agent" Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.497857 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.497986 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.500771 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.501036 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.566663 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-bc5bdf456-xt2x4" Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.657160 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c51760f-5a26-453e-b578-3bc16d784a4a-config-data\") pod \"ceilometer-0\" (UID: \"5c51760f-5a26-453e-b578-3bc16d784a4a\") " pod="openstack/ceilometer-0" Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.657206 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c51760f-5a26-453e-b578-3bc16d784a4a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5c51760f-5a26-453e-b578-3bc16d784a4a\") " pod="openstack/ceilometer-0" Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.657430 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr9w4\" (UniqueName: \"kubernetes.io/projected/5c51760f-5a26-453e-b578-3bc16d784a4a-kube-api-access-kr9w4\") pod \"ceilometer-0\" (UID: \"5c51760f-5a26-453e-b578-3bc16d784a4a\") " pod="openstack/ceilometer-0" Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.657563 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5c51760f-5a26-453e-b578-3bc16d784a4a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5c51760f-5a26-453e-b578-3bc16d784a4a\") " pod="openstack/ceilometer-0" Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.657589 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c51760f-5a26-453e-b578-3bc16d784a4a-log-httpd\") pod \"ceilometer-0\" (UID: \"5c51760f-5a26-453e-b578-3bc16d784a4a\") " pod="openstack/ceilometer-0" Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.657690 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c51760f-5a26-453e-b578-3bc16d784a4a-run-httpd\") pod \"ceilometer-0\" (UID: \"5c51760f-5a26-453e-b578-3bc16d784a4a\") " pod="openstack/ceilometer-0" Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.657744 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c51760f-5a26-453e-b578-3bc16d784a4a-scripts\") pod \"ceilometer-0\" (UID: \"5c51760f-5a26-453e-b578-3bc16d784a4a\") " pod="openstack/ceilometer-0" Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.697195 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-f865bb968-k9r7v" Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 
15:01:23.759703 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5c51760f-5a26-453e-b578-3bc16d784a4a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5c51760f-5a26-453e-b578-3bc16d784a4a\") " pod="openstack/ceilometer-0" Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.759753 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c51760f-5a26-453e-b578-3bc16d784a4a-log-httpd\") pod \"ceilometer-0\" (UID: \"5c51760f-5a26-453e-b578-3bc16d784a4a\") " pod="openstack/ceilometer-0" Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.759814 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c51760f-5a26-453e-b578-3bc16d784a4a-run-httpd\") pod \"ceilometer-0\" (UID: \"5c51760f-5a26-453e-b578-3bc16d784a4a\") " pod="openstack/ceilometer-0" Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.760039 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c51760f-5a26-453e-b578-3bc16d784a4a-scripts\") pod \"ceilometer-0\" (UID: \"5c51760f-5a26-453e-b578-3bc16d784a4a\") " pod="openstack/ceilometer-0" Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.760191 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c51760f-5a26-453e-b578-3bc16d784a4a-config-data\") pod \"ceilometer-0\" (UID: \"5c51760f-5a26-453e-b578-3bc16d784a4a\") " pod="openstack/ceilometer-0" Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.760220 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c51760f-5a26-453e-b578-3bc16d784a4a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"5c51760f-5a26-453e-b578-3bc16d784a4a\") " pod="openstack/ceilometer-0" Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.760253 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr9w4\" (UniqueName: \"kubernetes.io/projected/5c51760f-5a26-453e-b578-3bc16d784a4a-kube-api-access-kr9w4\") pod \"ceilometer-0\" (UID: \"5c51760f-5a26-453e-b578-3bc16d784a4a\") " pod="openstack/ceilometer-0" Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.760505 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c51760f-5a26-453e-b578-3bc16d784a4a-log-httpd\") pod \"ceilometer-0\" (UID: \"5c51760f-5a26-453e-b578-3bc16d784a4a\") " pod="openstack/ceilometer-0" Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.760510 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c51760f-5a26-453e-b578-3bc16d784a4a-run-httpd\") pod \"ceilometer-0\" (UID: \"5c51760f-5a26-453e-b578-3bc16d784a4a\") " pod="openstack/ceilometer-0" Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.764515 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5c51760f-5a26-453e-b578-3bc16d784a4a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5c51760f-5a26-453e-b578-3bc16d784a4a\") " pod="openstack/ceilometer-0" Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.765262 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c51760f-5a26-453e-b578-3bc16d784a4a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5c51760f-5a26-453e-b578-3bc16d784a4a\") " pod="openstack/ceilometer-0" Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.765741 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5c51760f-5a26-453e-b578-3bc16d784a4a-scripts\") pod \"ceilometer-0\" (UID: \"5c51760f-5a26-453e-b578-3bc16d784a4a\") " pod="openstack/ceilometer-0" Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.784354 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c51760f-5a26-453e-b578-3bc16d784a4a-config-data\") pod \"ceilometer-0\" (UID: \"5c51760f-5a26-453e-b578-3bc16d784a4a\") " pod="openstack/ceilometer-0" Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.786815 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr9w4\" (UniqueName: \"kubernetes.io/projected/5c51760f-5a26-453e-b578-3bc16d784a4a-kube-api-access-kr9w4\") pod \"ceilometer-0\" (UID: \"5c51760f-5a26-453e-b578-3bc16d784a4a\") " pod="openstack/ceilometer-0" Oct 03 15:01:23 crc kubenswrapper[4774]: I1003 15:01:23.823285 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 15:01:24 crc kubenswrapper[4774]: I1003 15:01:24.364124 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:01:24 crc kubenswrapper[4774]: W1003 15:01:24.364188 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c51760f_5a26_453e_b578_3bc16d784a4a.slice/crio-bdcecca6063c86549888b1bc0fea207b9ea8997d51e258721403e8739a9133ad WatchSource:0}: Error finding container bdcecca6063c86549888b1bc0fea207b9ea8997d51e258721403e8739a9133ad: Status 404 returned error can't find the container with id bdcecca6063c86549888b1bc0fea207b9ea8997d51e258721403e8739a9133ad Oct 03 15:01:24 crc kubenswrapper[4774]: I1003 15:01:24.411449 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"5c51760f-5a26-453e-b578-3bc16d784a4a","Type":"ContainerStarted","Data":"bdcecca6063c86549888b1bc0fea207b9ea8997d51e258721403e8739a9133ad"} Oct 03 15:01:24 crc kubenswrapper[4774]: I1003 15:01:24.413758 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gp26k" event={"ID":"29f3857a-d03c-48ab-93c4-0d75fc497c0e","Type":"ContainerDied","Data":"3721a4b95ab469a9fef4cef5499af290fd97dc2313b7266d4514c7d233652a61"} Oct 03 15:01:24 crc kubenswrapper[4774]: I1003 15:01:24.413708 4774 generic.go:334] "Generic (PLEG): container finished" podID="29f3857a-d03c-48ab-93c4-0d75fc497c0e" containerID="3721a4b95ab469a9fef4cef5499af290fd97dc2313b7266d4514c7d233652a61" exitCode=0 Oct 03 15:01:25 crc kubenswrapper[4774]: I1003 15:01:25.266500 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-bc5bdf456-xt2x4" Oct 03 15:01:25 crc kubenswrapper[4774]: I1003 15:01:25.328640 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9e41eee-4655-4cc2-b01d-37d1f947011b" path="/var/lib/kubelet/pods/d9e41eee-4655-4cc2-b01d-37d1f947011b/volumes" Oct 03 15:01:25 crc kubenswrapper[4774]: I1003 15:01:25.409726 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-f865bb968-k9r7v" Oct 03 15:01:25 crc kubenswrapper[4774]: I1003 15:01:25.469439 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-bc5bdf456-xt2x4"] Oct 03 15:01:25 crc kubenswrapper[4774]: I1003 15:01:25.471672 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-bc5bdf456-xt2x4" podUID="871f7d16-54b6-4aa9-8e99-00a888d41f70" containerName="horizon-log" containerID="cri-o://4c3807337ee7f2b8e44c4122821db338ac04f20ba3e7962936f507b835bfe6b9" gracePeriod=30 Oct 03 15:01:25 crc kubenswrapper[4774]: I1003 15:01:25.472129 4774 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/horizon-bc5bdf456-xt2x4" podUID="871f7d16-54b6-4aa9-8e99-00a888d41f70" containerName="horizon" containerID="cri-o://af677314a00ddf8adf8d016fd4a1bdce9452562c501a73facceeff916c357b0b" gracePeriod=30 Oct 03 15:01:25 crc kubenswrapper[4774]: I1003 15:01:25.793743 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-gp26k" Oct 03 15:01:25 crc kubenswrapper[4774]: I1003 15:01:25.897109 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29f3857a-d03c-48ab-93c4-0d75fc497c0e-db-sync-config-data\") pod \"29f3857a-d03c-48ab-93c4-0d75fc497c0e\" (UID: \"29f3857a-d03c-48ab-93c4-0d75fc497c0e\") " Oct 03 15:01:25 crc kubenswrapper[4774]: I1003 15:01:25.897877 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8sr2\" (UniqueName: \"kubernetes.io/projected/29f3857a-d03c-48ab-93c4-0d75fc497c0e-kube-api-access-v8sr2\") pod \"29f3857a-d03c-48ab-93c4-0d75fc497c0e\" (UID: \"29f3857a-d03c-48ab-93c4-0d75fc497c0e\") " Oct 03 15:01:25 crc kubenswrapper[4774]: I1003 15:01:25.897959 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29f3857a-d03c-48ab-93c4-0d75fc497c0e-combined-ca-bundle\") pod \"29f3857a-d03c-48ab-93c4-0d75fc497c0e\" (UID: \"29f3857a-d03c-48ab-93c4-0d75fc497c0e\") " Oct 03 15:01:25 crc kubenswrapper[4774]: I1003 15:01:25.901820 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29f3857a-d03c-48ab-93c4-0d75fc497c0e-kube-api-access-v8sr2" (OuterVolumeSpecName: "kube-api-access-v8sr2") pod "29f3857a-d03c-48ab-93c4-0d75fc497c0e" (UID: "29f3857a-d03c-48ab-93c4-0d75fc497c0e"). InnerVolumeSpecName "kube-api-access-v8sr2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:01:25 crc kubenswrapper[4774]: I1003 15:01:25.904639 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29f3857a-d03c-48ab-93c4-0d75fc497c0e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "29f3857a-d03c-48ab-93c4-0d75fc497c0e" (UID: "29f3857a-d03c-48ab-93c4-0d75fc497c0e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:25 crc kubenswrapper[4774]: I1003 15:01:25.923768 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29f3857a-d03c-48ab-93c4-0d75fc497c0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29f3857a-d03c-48ab-93c4-0d75fc497c0e" (UID: "29f3857a-d03c-48ab-93c4-0d75fc497c0e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.000519 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29f3857a-d03c-48ab-93c4-0d75fc497c0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.000551 4774 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29f3857a-d03c-48ab-93c4-0d75fc497c0e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.000564 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8sr2\" (UniqueName: \"kubernetes.io/projected/29f3857a-d03c-48ab-93c4-0d75fc497c0e-kube-api-access-v8sr2\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.431829 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"5c51760f-5a26-453e-b578-3bc16d784a4a","Type":"ContainerStarted","Data":"2ab27bcf9d1600d0ab6f3b362911893a98a6cf7d2181a8b461f383fbca930634"} Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.433058 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gp26k" event={"ID":"29f3857a-d03c-48ab-93c4-0d75fc497c0e","Type":"ContainerDied","Data":"16011eb89c43902ef7360101893272d6a5850d977c04fa1eda85af03db0c9c1c"} Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.433078 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16011eb89c43902ef7360101893272d6a5850d977c04fa1eda85af03db0c9c1c" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.433119 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-gp26k" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.705232 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5b679cb88b-5lrzw"] Oct 03 15:01:26 crc kubenswrapper[4774]: E1003 15:01:26.713122 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29f3857a-d03c-48ab-93c4-0d75fc497c0e" containerName="barbican-db-sync" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.713139 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="29f3857a-d03c-48ab-93c4-0d75fc497c0e" containerName="barbican-db-sync" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.713325 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="29f3857a-d03c-48ab-93c4-0d75fc497c0e" containerName="barbican-db-sync" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.714209 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5b679cb88b-5lrzw" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.717970 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-5zx57" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.721826 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.722028 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.730594 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5b679cb88b-5lrzw"] Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.752786 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-74d9f6d95f-l289b"] Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.766300 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-74d9f6d95f-l289b" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.776000 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-74d9f6d95f-l289b"] Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.776230 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.817554 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ceaeb2a-3322-44b9-88ae-5c473721a68f-logs\") pod \"barbican-keystone-listener-5b679cb88b-5lrzw\" (UID: \"3ceaeb2a-3322-44b9-88ae-5c473721a68f\") " pod="openstack/barbican-keystone-listener-5b679cb88b-5lrzw" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.817604 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ceaeb2a-3322-44b9-88ae-5c473721a68f-config-data\") pod \"barbican-keystone-listener-5b679cb88b-5lrzw\" (UID: \"3ceaeb2a-3322-44b9-88ae-5c473721a68f\") " pod="openstack/barbican-keystone-listener-5b679cb88b-5lrzw" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.817661 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ceaeb2a-3322-44b9-88ae-5c473721a68f-combined-ca-bundle\") pod \"barbican-keystone-listener-5b679cb88b-5lrzw\" (UID: \"3ceaeb2a-3322-44b9-88ae-5c473721a68f\") " pod="openstack/barbican-keystone-listener-5b679cb88b-5lrzw" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.817720 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbxzz\" (UniqueName: \"kubernetes.io/projected/3ceaeb2a-3322-44b9-88ae-5c473721a68f-kube-api-access-bbxzz\") pod 
\"barbican-keystone-listener-5b679cb88b-5lrzw\" (UID: \"3ceaeb2a-3322-44b9-88ae-5c473721a68f\") " pod="openstack/barbican-keystone-listener-5b679cb88b-5lrzw" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.817849 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ceaeb2a-3322-44b9-88ae-5c473721a68f-config-data-custom\") pod \"barbican-keystone-listener-5b679cb88b-5lrzw\" (UID: \"3ceaeb2a-3322-44b9-88ae-5c473721a68f\") " pod="openstack/barbican-keystone-listener-5b679cb88b-5lrzw" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.817958 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-5rmz7"] Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.832862 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59d5ff467f-5rmz7" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.848428 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-5rmz7"] Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.913469 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-78d8f8854b-zdr5d"] Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.919300 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04d07033-b1e8-426d-828b-e78cb0f44294-config-data-custom\") pod \"barbican-worker-74d9f6d95f-l289b\" (UID: \"04d07033-b1e8-426d-828b-e78cb0f44294\") " pod="openstack/barbican-worker-74d9f6d95f-l289b" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.919359 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbxzz\" (UniqueName: \"kubernetes.io/projected/3ceaeb2a-3322-44b9-88ae-5c473721a68f-kube-api-access-bbxzz\") pod 
\"barbican-keystone-listener-5b679cb88b-5lrzw\" (UID: \"3ceaeb2a-3322-44b9-88ae-5c473721a68f\") " pod="openstack/barbican-keystone-listener-5b679cb88b-5lrzw" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.919420 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95263c08-34fe-4319-ae88-dc01b7609122-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5ff467f-5rmz7\" (UID: \"95263c08-34fe-4319-ae88-dc01b7609122\") " pod="openstack/dnsmasq-dns-59d5ff467f-5rmz7" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.923170 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/95263c08-34fe-4319-ae88-dc01b7609122-dns-swift-storage-0\") pod \"dnsmasq-dns-59d5ff467f-5rmz7\" (UID: \"95263c08-34fe-4319-ae88-dc01b7609122\") " pod="openstack/dnsmasq-dns-59d5ff467f-5rmz7" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.923253 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ceaeb2a-3322-44b9-88ae-5c473721a68f-config-data-custom\") pod \"barbican-keystone-listener-5b679cb88b-5lrzw\" (UID: \"3ceaeb2a-3322-44b9-88ae-5c473721a68f\") " pod="openstack/barbican-keystone-listener-5b679cb88b-5lrzw" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.923332 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04d07033-b1e8-426d-828b-e78cb0f44294-config-data\") pod \"barbican-worker-74d9f6d95f-l289b\" (UID: \"04d07033-b1e8-426d-828b-e78cb0f44294\") " pod="openstack/barbican-worker-74d9f6d95f-l289b" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.923387 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-p8jhm\" (UniqueName: \"kubernetes.io/projected/04d07033-b1e8-426d-828b-e78cb0f44294-kube-api-access-p8jhm\") pod \"barbican-worker-74d9f6d95f-l289b\" (UID: \"04d07033-b1e8-426d-828b-e78cb0f44294\") " pod="openstack/barbican-worker-74d9f6d95f-l289b" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.923458 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95263c08-34fe-4319-ae88-dc01b7609122-config\") pod \"dnsmasq-dns-59d5ff467f-5rmz7\" (UID: \"95263c08-34fe-4319-ae88-dc01b7609122\") " pod="openstack/dnsmasq-dns-59d5ff467f-5rmz7" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.923531 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04d07033-b1e8-426d-828b-e78cb0f44294-combined-ca-bundle\") pod \"barbican-worker-74d9f6d95f-l289b\" (UID: \"04d07033-b1e8-426d-828b-e78cb0f44294\") " pod="openstack/barbican-worker-74d9f6d95f-l289b" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.923573 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvpxj\" (UniqueName: \"kubernetes.io/projected/95263c08-34fe-4319-ae88-dc01b7609122-kube-api-access-vvpxj\") pod \"dnsmasq-dns-59d5ff467f-5rmz7\" (UID: \"95263c08-34fe-4319-ae88-dc01b7609122\") " pod="openstack/dnsmasq-dns-59d5ff467f-5rmz7" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.923602 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ceaeb2a-3322-44b9-88ae-5c473721a68f-logs\") pod \"barbican-keystone-listener-5b679cb88b-5lrzw\" (UID: \"3ceaeb2a-3322-44b9-88ae-5c473721a68f\") " pod="openstack/barbican-keystone-listener-5b679cb88b-5lrzw" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.923628 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ceaeb2a-3322-44b9-88ae-5c473721a68f-config-data\") pod \"barbican-keystone-listener-5b679cb88b-5lrzw\" (UID: \"3ceaeb2a-3322-44b9-88ae-5c473721a68f\") " pod="openstack/barbican-keystone-listener-5b679cb88b-5lrzw" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.923699 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95263c08-34fe-4319-ae88-dc01b7609122-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5ff467f-5rmz7\" (UID: \"95263c08-34fe-4319-ae88-dc01b7609122\") " pod="openstack/dnsmasq-dns-59d5ff467f-5rmz7" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.923735 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ceaeb2a-3322-44b9-88ae-5c473721a68f-combined-ca-bundle\") pod \"barbican-keystone-listener-5b679cb88b-5lrzw\" (UID: \"3ceaeb2a-3322-44b9-88ae-5c473721a68f\") " pod="openstack/barbican-keystone-listener-5b679cb88b-5lrzw" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.923786 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95263c08-34fe-4319-ae88-dc01b7609122-dns-svc\") pod \"dnsmasq-dns-59d5ff467f-5rmz7\" (UID: \"95263c08-34fe-4319-ae88-dc01b7609122\") " pod="openstack/dnsmasq-dns-59d5ff467f-5rmz7" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.923818 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04d07033-b1e8-426d-828b-e78cb0f44294-logs\") pod \"barbican-worker-74d9f6d95f-l289b\" (UID: \"04d07033-b1e8-426d-828b-e78cb0f44294\") " pod="openstack/barbican-worker-74d9f6d95f-l289b" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 
15:01:26.925121 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ceaeb2a-3322-44b9-88ae-5c473721a68f-logs\") pod \"barbican-keystone-listener-5b679cb88b-5lrzw\" (UID: \"3ceaeb2a-3322-44b9-88ae-5c473721a68f\") " pod="openstack/barbican-keystone-listener-5b679cb88b-5lrzw" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.927831 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-78d8f8854b-zdr5d"] Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.927934 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-78d8f8854b-zdr5d" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.932032 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.940280 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ceaeb2a-3322-44b9-88ae-5c473721a68f-config-data\") pod \"barbican-keystone-listener-5b679cb88b-5lrzw\" (UID: \"3ceaeb2a-3322-44b9-88ae-5c473721a68f\") " pod="openstack/barbican-keystone-listener-5b679cb88b-5lrzw" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.945794 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbxzz\" (UniqueName: \"kubernetes.io/projected/3ceaeb2a-3322-44b9-88ae-5c473721a68f-kube-api-access-bbxzz\") pod \"barbican-keystone-listener-5b679cb88b-5lrzw\" (UID: \"3ceaeb2a-3322-44b9-88ae-5c473721a68f\") " pod="openstack/barbican-keystone-listener-5b679cb88b-5lrzw" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.965088 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ceaeb2a-3322-44b9-88ae-5c473721a68f-config-data-custom\") pod \"barbican-keystone-listener-5b679cb88b-5lrzw\" 
(UID: \"3ceaeb2a-3322-44b9-88ae-5c473721a68f\") " pod="openstack/barbican-keystone-listener-5b679cb88b-5lrzw" Oct 03 15:01:26 crc kubenswrapper[4774]: I1003 15:01:26.965114 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ceaeb2a-3322-44b9-88ae-5c473721a68f-combined-ca-bundle\") pod \"barbican-keystone-listener-5b679cb88b-5lrzw\" (UID: \"3ceaeb2a-3322-44b9-88ae-5c473721a68f\") " pod="openstack/barbican-keystone-listener-5b679cb88b-5lrzw" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.028299 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95263c08-34fe-4319-ae88-dc01b7609122-dns-svc\") pod \"dnsmasq-dns-59d5ff467f-5rmz7\" (UID: \"95263c08-34fe-4319-ae88-dc01b7609122\") " pod="openstack/dnsmasq-dns-59d5ff467f-5rmz7" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.028354 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04d07033-b1e8-426d-828b-e78cb0f44294-logs\") pod \"barbican-worker-74d9f6d95f-l289b\" (UID: \"04d07033-b1e8-426d-828b-e78cb0f44294\") " pod="openstack/barbican-worker-74d9f6d95f-l289b" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.028393 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04d07033-b1e8-426d-828b-e78cb0f44294-config-data-custom\") pod \"barbican-worker-74d9f6d95f-l289b\" (UID: \"04d07033-b1e8-426d-828b-e78cb0f44294\") " pod="openstack/barbican-worker-74d9f6d95f-l289b" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.028430 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95263c08-34fe-4319-ae88-dc01b7609122-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5ff467f-5rmz7\" (UID: 
\"95263c08-34fe-4319-ae88-dc01b7609122\") " pod="openstack/dnsmasq-dns-59d5ff467f-5rmz7" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.028461 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/95263c08-34fe-4319-ae88-dc01b7609122-dns-swift-storage-0\") pod \"dnsmasq-dns-59d5ff467f-5rmz7\" (UID: \"95263c08-34fe-4319-ae88-dc01b7609122\") " pod="openstack/dnsmasq-dns-59d5ff467f-5rmz7" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.028499 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04d07033-b1e8-426d-828b-e78cb0f44294-config-data\") pod \"barbican-worker-74d9f6d95f-l289b\" (UID: \"04d07033-b1e8-426d-828b-e78cb0f44294\") " pod="openstack/barbican-worker-74d9f6d95f-l289b" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.028538 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n8gh\" (UniqueName: \"kubernetes.io/projected/d926f2ba-fdb2-4eb0-a509-5b9eb13ed626-kube-api-access-5n8gh\") pod \"barbican-api-78d8f8854b-zdr5d\" (UID: \"d926f2ba-fdb2-4eb0-a509-5b9eb13ed626\") " pod="openstack/barbican-api-78d8f8854b-zdr5d" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.028555 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d926f2ba-fdb2-4eb0-a509-5b9eb13ed626-logs\") pod \"barbican-api-78d8f8854b-zdr5d\" (UID: \"d926f2ba-fdb2-4eb0-a509-5b9eb13ed626\") " pod="openstack/barbican-api-78d8f8854b-zdr5d" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.028570 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8jhm\" (UniqueName: \"kubernetes.io/projected/04d07033-b1e8-426d-828b-e78cb0f44294-kube-api-access-p8jhm\") pod 
\"barbican-worker-74d9f6d95f-l289b\" (UID: \"04d07033-b1e8-426d-828b-e78cb0f44294\") " pod="openstack/barbican-worker-74d9f6d95f-l289b" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.028589 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d926f2ba-fdb2-4eb0-a509-5b9eb13ed626-config-data\") pod \"barbican-api-78d8f8854b-zdr5d\" (UID: \"d926f2ba-fdb2-4eb0-a509-5b9eb13ed626\") " pod="openstack/barbican-api-78d8f8854b-zdr5d" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.028613 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d926f2ba-fdb2-4eb0-a509-5b9eb13ed626-combined-ca-bundle\") pod \"barbican-api-78d8f8854b-zdr5d\" (UID: \"d926f2ba-fdb2-4eb0-a509-5b9eb13ed626\") " pod="openstack/barbican-api-78d8f8854b-zdr5d" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.028632 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95263c08-34fe-4319-ae88-dc01b7609122-config\") pod \"dnsmasq-dns-59d5ff467f-5rmz7\" (UID: \"95263c08-34fe-4319-ae88-dc01b7609122\") " pod="openstack/dnsmasq-dns-59d5ff467f-5rmz7" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.028663 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04d07033-b1e8-426d-828b-e78cb0f44294-combined-ca-bundle\") pod \"barbican-worker-74d9f6d95f-l289b\" (UID: \"04d07033-b1e8-426d-828b-e78cb0f44294\") " pod="openstack/barbican-worker-74d9f6d95f-l289b" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.028685 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvpxj\" (UniqueName: \"kubernetes.io/projected/95263c08-34fe-4319-ae88-dc01b7609122-kube-api-access-vvpxj\") 
pod \"dnsmasq-dns-59d5ff467f-5rmz7\" (UID: \"95263c08-34fe-4319-ae88-dc01b7609122\") " pod="openstack/dnsmasq-dns-59d5ff467f-5rmz7" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.028715 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d926f2ba-fdb2-4eb0-a509-5b9eb13ed626-config-data-custom\") pod \"barbican-api-78d8f8854b-zdr5d\" (UID: \"d926f2ba-fdb2-4eb0-a509-5b9eb13ed626\") " pod="openstack/barbican-api-78d8f8854b-zdr5d" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.028739 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95263c08-34fe-4319-ae88-dc01b7609122-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5ff467f-5rmz7\" (UID: \"95263c08-34fe-4319-ae88-dc01b7609122\") " pod="openstack/dnsmasq-dns-59d5ff467f-5rmz7" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.029589 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95263c08-34fe-4319-ae88-dc01b7609122-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5ff467f-5rmz7\" (UID: \"95263c08-34fe-4319-ae88-dc01b7609122\") " pod="openstack/dnsmasq-dns-59d5ff467f-5rmz7" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.030108 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95263c08-34fe-4319-ae88-dc01b7609122-dns-svc\") pod \"dnsmasq-dns-59d5ff467f-5rmz7\" (UID: \"95263c08-34fe-4319-ae88-dc01b7609122\") " pod="openstack/dnsmasq-dns-59d5ff467f-5rmz7" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.030330 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04d07033-b1e8-426d-828b-e78cb0f44294-logs\") pod \"barbican-worker-74d9f6d95f-l289b\" (UID: 
\"04d07033-b1e8-426d-828b-e78cb0f44294\") " pod="openstack/barbican-worker-74d9f6d95f-l289b" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.037623 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95263c08-34fe-4319-ae88-dc01b7609122-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5ff467f-5rmz7\" (UID: \"95263c08-34fe-4319-ae88-dc01b7609122\") " pod="openstack/dnsmasq-dns-59d5ff467f-5rmz7" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.038281 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/95263c08-34fe-4319-ae88-dc01b7609122-dns-swift-storage-0\") pod \"dnsmasq-dns-59d5ff467f-5rmz7\" (UID: \"95263c08-34fe-4319-ae88-dc01b7609122\") " pod="openstack/dnsmasq-dns-59d5ff467f-5rmz7" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.038346 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95263c08-34fe-4319-ae88-dc01b7609122-config\") pod \"dnsmasq-dns-59d5ff467f-5rmz7\" (UID: \"95263c08-34fe-4319-ae88-dc01b7609122\") " pod="openstack/dnsmasq-dns-59d5ff467f-5rmz7" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.043278 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04d07033-b1e8-426d-828b-e78cb0f44294-config-data\") pod \"barbican-worker-74d9f6d95f-l289b\" (UID: \"04d07033-b1e8-426d-828b-e78cb0f44294\") " pod="openstack/barbican-worker-74d9f6d95f-l289b" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.051067 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04d07033-b1e8-426d-828b-e78cb0f44294-combined-ca-bundle\") pod \"barbican-worker-74d9f6d95f-l289b\" (UID: \"04d07033-b1e8-426d-828b-e78cb0f44294\") " 
pod="openstack/barbican-worker-74d9f6d95f-l289b" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.053976 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04d07033-b1e8-426d-828b-e78cb0f44294-config-data-custom\") pod \"barbican-worker-74d9f6d95f-l289b\" (UID: \"04d07033-b1e8-426d-828b-e78cb0f44294\") " pod="openstack/barbican-worker-74d9f6d95f-l289b" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.057928 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8jhm\" (UniqueName: \"kubernetes.io/projected/04d07033-b1e8-426d-828b-e78cb0f44294-kube-api-access-p8jhm\") pod \"barbican-worker-74d9f6d95f-l289b\" (UID: \"04d07033-b1e8-426d-828b-e78cb0f44294\") " pod="openstack/barbican-worker-74d9f6d95f-l289b" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.063119 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvpxj\" (UniqueName: \"kubernetes.io/projected/95263c08-34fe-4319-ae88-dc01b7609122-kube-api-access-vvpxj\") pod \"dnsmasq-dns-59d5ff467f-5rmz7\" (UID: \"95263c08-34fe-4319-ae88-dc01b7609122\") " pod="openstack/dnsmasq-dns-59d5ff467f-5rmz7" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.099750 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5b679cb88b-5lrzw" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.125317 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-74d9f6d95f-l289b" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.130102 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d926f2ba-fdb2-4eb0-a509-5b9eb13ed626-config-data-custom\") pod \"barbican-api-78d8f8854b-zdr5d\" (UID: \"d926f2ba-fdb2-4eb0-a509-5b9eb13ed626\") " pod="openstack/barbican-api-78d8f8854b-zdr5d" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.130220 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n8gh\" (UniqueName: \"kubernetes.io/projected/d926f2ba-fdb2-4eb0-a509-5b9eb13ed626-kube-api-access-5n8gh\") pod \"barbican-api-78d8f8854b-zdr5d\" (UID: \"d926f2ba-fdb2-4eb0-a509-5b9eb13ed626\") " pod="openstack/barbican-api-78d8f8854b-zdr5d" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.130240 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d926f2ba-fdb2-4eb0-a509-5b9eb13ed626-logs\") pod \"barbican-api-78d8f8854b-zdr5d\" (UID: \"d926f2ba-fdb2-4eb0-a509-5b9eb13ed626\") " pod="openstack/barbican-api-78d8f8854b-zdr5d" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.130261 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d926f2ba-fdb2-4eb0-a509-5b9eb13ed626-config-data\") pod \"barbican-api-78d8f8854b-zdr5d\" (UID: \"d926f2ba-fdb2-4eb0-a509-5b9eb13ed626\") " pod="openstack/barbican-api-78d8f8854b-zdr5d" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.130285 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d926f2ba-fdb2-4eb0-a509-5b9eb13ed626-combined-ca-bundle\") pod \"barbican-api-78d8f8854b-zdr5d\" (UID: \"d926f2ba-fdb2-4eb0-a509-5b9eb13ed626\") " 
pod="openstack/barbican-api-78d8f8854b-zdr5d" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.131744 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d926f2ba-fdb2-4eb0-a509-5b9eb13ed626-logs\") pod \"barbican-api-78d8f8854b-zdr5d\" (UID: \"d926f2ba-fdb2-4eb0-a509-5b9eb13ed626\") " pod="openstack/barbican-api-78d8f8854b-zdr5d" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.134024 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d926f2ba-fdb2-4eb0-a509-5b9eb13ed626-combined-ca-bundle\") pod \"barbican-api-78d8f8854b-zdr5d\" (UID: \"d926f2ba-fdb2-4eb0-a509-5b9eb13ed626\") " pod="openstack/barbican-api-78d8f8854b-zdr5d" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.135353 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d926f2ba-fdb2-4eb0-a509-5b9eb13ed626-config-data-custom\") pod \"barbican-api-78d8f8854b-zdr5d\" (UID: \"d926f2ba-fdb2-4eb0-a509-5b9eb13ed626\") " pod="openstack/barbican-api-78d8f8854b-zdr5d" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.136341 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d926f2ba-fdb2-4eb0-a509-5b9eb13ed626-config-data\") pod \"barbican-api-78d8f8854b-zdr5d\" (UID: \"d926f2ba-fdb2-4eb0-a509-5b9eb13ed626\") " pod="openstack/barbican-api-78d8f8854b-zdr5d" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.149966 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n8gh\" (UniqueName: \"kubernetes.io/projected/d926f2ba-fdb2-4eb0-a509-5b9eb13ed626-kube-api-access-5n8gh\") pod \"barbican-api-78d8f8854b-zdr5d\" (UID: \"d926f2ba-fdb2-4eb0-a509-5b9eb13ed626\") " pod="openstack/barbican-api-78d8f8854b-zdr5d" Oct 03 15:01:27 crc kubenswrapper[4774]: 
I1003 15:01:27.155650 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59d5ff467f-5rmz7" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.318859 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-78d8f8854b-zdr5d" Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.455825 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c51760f-5a26-453e-b578-3bc16d784a4a","Type":"ContainerStarted","Data":"858c84834eaf180af71bc0ede5599a2221a672db851281feecbd46e4d4d042c7"} Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.593156 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5b679cb88b-5lrzw"] Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.696668 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-74d9f6d95f-l289b"] Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.801311 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-5rmz7"] Oct 03 15:01:27 crc kubenswrapper[4774]: W1003 15:01:27.807119 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95263c08_34fe_4319_ae88_dc01b7609122.slice/crio-1ad5057f92688cf5eff225d19ca061e8b974a6aafa9b4cb36e190f3f7541cb85 WatchSource:0}: Error finding container 1ad5057f92688cf5eff225d19ca061e8b974a6aafa9b4cb36e190f3f7541cb85: Status 404 returned error can't find the container with id 1ad5057f92688cf5eff225d19ca061e8b974a6aafa9b4cb36e190f3f7541cb85 Oct 03 15:01:27 crc kubenswrapper[4774]: I1003 15:01:27.921136 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-78d8f8854b-zdr5d"] Oct 03 15:01:28 crc kubenswrapper[4774]: I1003 15:01:28.464554 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-keystone-listener-5b679cb88b-5lrzw" event={"ID":"3ceaeb2a-3322-44b9-88ae-5c473721a68f","Type":"ContainerStarted","Data":"a8ad70ad3c34d74ad8bd069f71d2d750a78b02f71525b1a71d73616060c8e027"} Oct 03 15:01:28 crc kubenswrapper[4774]: I1003 15:01:28.466160 4774 generic.go:334] "Generic (PLEG): container finished" podID="95263c08-34fe-4319-ae88-dc01b7609122" containerID="c7870ac41844f9ef4459ef7e800f9d09de4f3d2a179f066890482f91ee2e5963" exitCode=0 Oct 03 15:01:28 crc kubenswrapper[4774]: I1003 15:01:28.466198 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-5rmz7" event={"ID":"95263c08-34fe-4319-ae88-dc01b7609122","Type":"ContainerDied","Data":"c7870ac41844f9ef4459ef7e800f9d09de4f3d2a179f066890482f91ee2e5963"} Oct 03 15:01:28 crc kubenswrapper[4774]: I1003 15:01:28.466216 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-5rmz7" event={"ID":"95263c08-34fe-4319-ae88-dc01b7609122","Type":"ContainerStarted","Data":"1ad5057f92688cf5eff225d19ca061e8b974a6aafa9b4cb36e190f3f7541cb85"} Oct 03 15:01:28 crc kubenswrapper[4774]: I1003 15:01:28.470841 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-74d9f6d95f-l289b" event={"ID":"04d07033-b1e8-426d-828b-e78cb0f44294","Type":"ContainerStarted","Data":"d5388e443597505961d68454072c30a2a2ff79b5d8912e7eb491094c7d57ef4f"} Oct 03 15:01:28 crc kubenswrapper[4774]: I1003 15:01:28.476079 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-78d8f8854b-zdr5d" event={"ID":"d926f2ba-fdb2-4eb0-a509-5b9eb13ed626","Type":"ContainerStarted","Data":"975761272e9cd5eaae32f0a2f20a24e0dd1eb200f15996617c757a59662d1f70"} Oct 03 15:01:28 crc kubenswrapper[4774]: I1003 15:01:28.476126 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-78d8f8854b-zdr5d" 
event={"ID":"d926f2ba-fdb2-4eb0-a509-5b9eb13ed626","Type":"ContainerStarted","Data":"d488a05490e9808dd70e8c58e9e89fee94ce23e580e98baaaaa1f1c7b20919cd"} Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.443455 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c54b8bfd5-ftr47" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.499998 4774 generic.go:334] "Generic (PLEG): container finished" podID="e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a" containerID="070d22da48804090a3d16634343e38782fa1cfae91b0c20b52e74ac8cb456ea9" exitCode=137 Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.500027 4774 generic.go:334] "Generic (PLEG): container finished" podID="e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a" containerID="0d7e8b2630f6e23e079e498a4d690a2d7552d2b105fae7efa228ba39fd181246" exitCode=137 Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.500057 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-765f99df69-hhjpp" event={"ID":"e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a","Type":"ContainerDied","Data":"070d22da48804090a3d16634343e38782fa1cfae91b0c20b52e74ac8cb456ea9"} Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.500083 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-765f99df69-hhjpp" event={"ID":"e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a","Type":"ContainerDied","Data":"0d7e8b2630f6e23e079e498a4d690a2d7552d2b105fae7efa228ba39fd181246"} Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.501335 4774 generic.go:334] "Generic (PLEG): container finished" podID="871f7d16-54b6-4aa9-8e99-00a888d41f70" containerID="af677314a00ddf8adf8d016fd4a1bdce9452562c501a73facceeff916c357b0b" exitCode=0 Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.501364 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bc5bdf456-xt2x4" 
event={"ID":"871f7d16-54b6-4aa9-8e99-00a888d41f70","Type":"ContainerDied","Data":"af677314a00ddf8adf8d016fd4a1bdce9452562c501a73facceeff916c357b0b"} Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.520627 4774 generic.go:334] "Generic (PLEG): container finished" podID="fc47f9df-51ec-4aad-861b-c04b1321c5a3" containerID="247a9528a6cf7a41ed40e9c7c10c6a24c2c7ff0accb2b9421e74e4e566487827" exitCode=137 Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.520683 4774 generic.go:334] "Generic (PLEG): container finished" podID="fc47f9df-51ec-4aad-861b-c04b1321c5a3" containerID="2a9b54c31753f543f9667511be61815d3355dd815fbe91ce4cfab5a8a287e841" exitCode=137 Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.520792 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c54b8bfd5-ftr47" event={"ID":"fc47f9df-51ec-4aad-861b-c04b1321c5a3","Type":"ContainerDied","Data":"247a9528a6cf7a41ed40e9c7c10c6a24c2c7ff0accb2b9421e74e4e566487827"} Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.520832 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c54b8bfd5-ftr47" event={"ID":"fc47f9df-51ec-4aad-861b-c04b1321c5a3","Type":"ContainerDied","Data":"2a9b54c31753f543f9667511be61815d3355dd815fbe91ce4cfab5a8a287e841"} Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.520848 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c54b8bfd5-ftr47" event={"ID":"fc47f9df-51ec-4aad-861b-c04b1321c5a3","Type":"ContainerDied","Data":"6b0f81373fdb8f9647deae4a695905b55eded21f10c5441e988626234a4c1453"} Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.520871 4774 scope.go:117] "RemoveContainer" containerID="247a9528a6cf7a41ed40e9c7c10c6a24c2c7ff0accb2b9421e74e4e566487827" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.521099 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7c54b8bfd5-ftr47" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.532425 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-5rmz7" event={"ID":"95263c08-34fe-4319-ae88-dc01b7609122","Type":"ContainerStarted","Data":"7e7c0515d861da53fefaf39f8fd79c6c8488d31684c14ea120fd5e118b827dd1"} Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.533660 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59d5ff467f-5rmz7" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.535242 4774 generic.go:334] "Generic (PLEG): container finished" podID="b635609a-ab1c-4691-be86-da83abc3e663" containerID="be81210244b0fa3606b3f1432526e204cc82f1277bf8001d1f74902aad26e524" exitCode=137 Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.535276 4774 generic.go:334] "Generic (PLEG): container finished" podID="b635609a-ab1c-4691-be86-da83abc3e663" containerID="d0e93c9919c9cc5e5ca30ec385dfc4897c2c951e0d054737ed3f23c89bc0759d" exitCode=137 Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.535323 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-587669876f-dmzh9" event={"ID":"b635609a-ab1c-4691-be86-da83abc3e663","Type":"ContainerDied","Data":"be81210244b0fa3606b3f1432526e204cc82f1277bf8001d1f74902aad26e524"} Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.535342 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-587669876f-dmzh9" event={"ID":"b635609a-ab1c-4691-be86-da83abc3e663","Type":"ContainerDied","Data":"d0e93c9919c9cc5e5ca30ec385dfc4897c2c951e0d054737ed3f23c89bc0759d"} Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.538002 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c51760f-5a26-453e-b578-3bc16d784a4a","Type":"ContainerStarted","Data":"558706c0c65d7eb0e5d8e36f2210d186dcc18a506673e23cbe79dcc06a747ecf"} Oct 03 
15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.554079 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-78d8f8854b-zdr5d" event={"ID":"d926f2ba-fdb2-4eb0-a509-5b9eb13ed626","Type":"ContainerStarted","Data":"5625eea49c42c6a502d69e96000ba3039668090d6a81040b9b904c035bd8a32c"} Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.554932 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-78d8f8854b-zdr5d" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.554965 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-78d8f8854b-zdr5d" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.572320 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59d5ff467f-5rmz7" podStartSLOduration=3.572303005 podStartE2EDuration="3.572303005s" podCreationTimestamp="2025-10-03 15:01:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:01:29.568142671 +0000 UTC m=+1112.157346133" watchObservedRunningTime="2025-10-03 15:01:29.572303005 +0000 UTC m=+1112.161506457" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.582624 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc47f9df-51ec-4aad-861b-c04b1321c5a3-config-data\") pod \"fc47f9df-51ec-4aad-861b-c04b1321c5a3\" (UID: \"fc47f9df-51ec-4aad-861b-c04b1321c5a3\") " Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.582727 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc47f9df-51ec-4aad-861b-c04b1321c5a3-logs\") pod \"fc47f9df-51ec-4aad-861b-c04b1321c5a3\" (UID: \"fc47f9df-51ec-4aad-861b-c04b1321c5a3\") " Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.582865 4774 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc47f9df-51ec-4aad-861b-c04b1321c5a3-scripts\") pod \"fc47f9df-51ec-4aad-861b-c04b1321c5a3\" (UID: \"fc47f9df-51ec-4aad-861b-c04b1321c5a3\") " Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.582944 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5j2h\" (UniqueName: \"kubernetes.io/projected/fc47f9df-51ec-4aad-861b-c04b1321c5a3-kube-api-access-j5j2h\") pod \"fc47f9df-51ec-4aad-861b-c04b1321c5a3\" (UID: \"fc47f9df-51ec-4aad-861b-c04b1321c5a3\") " Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.583050 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fc47f9df-51ec-4aad-861b-c04b1321c5a3-horizon-secret-key\") pod \"fc47f9df-51ec-4aad-861b-c04b1321c5a3\" (UID: \"fc47f9df-51ec-4aad-861b-c04b1321c5a3\") " Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.586575 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc47f9df-51ec-4aad-861b-c04b1321c5a3-logs" (OuterVolumeSpecName: "logs") pod "fc47f9df-51ec-4aad-861b-c04b1321c5a3" (UID: "fc47f9df-51ec-4aad-861b-c04b1321c5a3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.599825 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc47f9df-51ec-4aad-861b-c04b1321c5a3-kube-api-access-j5j2h" (OuterVolumeSpecName: "kube-api-access-j5j2h") pod "fc47f9df-51ec-4aad-861b-c04b1321c5a3" (UID: "fc47f9df-51ec-4aad-861b-c04b1321c5a3"). InnerVolumeSpecName "kube-api-access-j5j2h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.602645 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc47f9df-51ec-4aad-861b-c04b1321c5a3-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "fc47f9df-51ec-4aad-861b-c04b1321c5a3" (UID: "fc47f9df-51ec-4aad-861b-c04b1321c5a3"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.604163 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-78d8f8854b-zdr5d" podStartSLOduration=3.604133983 podStartE2EDuration="3.604133983s" podCreationTimestamp="2025-10-03 15:01:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:01:29.59510509 +0000 UTC m=+1112.184308542" watchObservedRunningTime="2025-10-03 15:01:29.604133983 +0000 UTC m=+1112.193337435" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.632665 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc47f9df-51ec-4aad-861b-c04b1321c5a3-scripts" (OuterVolumeSpecName: "scripts") pod "fc47f9df-51ec-4aad-861b-c04b1321c5a3" (UID: "fc47f9df-51ec-4aad-861b-c04b1321c5a3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.653133 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc47f9df-51ec-4aad-861b-c04b1321c5a3-config-data" (OuterVolumeSpecName: "config-data") pod "fc47f9df-51ec-4aad-861b-c04b1321c5a3" (UID: "fc47f9df-51ec-4aad-861b-c04b1321c5a3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.685118 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc47f9df-51ec-4aad-861b-c04b1321c5a3-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.685143 4774 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc47f9df-51ec-4aad-861b-c04b1321c5a3-logs\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.685153 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc47f9df-51ec-4aad-861b-c04b1321c5a3-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.685162 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5j2h\" (UniqueName: \"kubernetes.io/projected/fc47f9df-51ec-4aad-861b-c04b1321c5a3-kube-api-access-j5j2h\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.685172 4774 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fc47f9df-51ec-4aad-861b-c04b1321c5a3-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.752039 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-765f99df69-hhjpp" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.768441 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5587b8897d-cknc7"] Oct 03 15:01:29 crc kubenswrapper[4774]: E1003 15:01:29.768903 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a" containerName="horizon-log" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.768920 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a" containerName="horizon-log" Oct 03 15:01:29 crc kubenswrapper[4774]: E1003 15:01:29.768954 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc47f9df-51ec-4aad-861b-c04b1321c5a3" containerName="horizon" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.768963 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc47f9df-51ec-4aad-861b-c04b1321c5a3" containerName="horizon" Oct 03 15:01:29 crc kubenswrapper[4774]: E1003 15:01:29.768985 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a" containerName="horizon" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.768995 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a" containerName="horizon" Oct 03 15:01:29 crc kubenswrapper[4774]: E1003 15:01:29.769010 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc47f9df-51ec-4aad-861b-c04b1321c5a3" containerName="horizon-log" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.769018 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc47f9df-51ec-4aad-861b-c04b1321c5a3" containerName="horizon-log" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.769211 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a" containerName="horizon-log" Oct 03 15:01:29 crc 
kubenswrapper[4774]: I1003 15:01:29.769226 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc47f9df-51ec-4aad-861b-c04b1321c5a3" containerName="horizon-log" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.769239 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc47f9df-51ec-4aad-861b-c04b1321c5a3" containerName="horizon" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.769259 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a" containerName="horizon" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.770357 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5587b8897d-cknc7" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.773070 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.773279 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.786698 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5587b8897d-cknc7"] Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.881408 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c54b8bfd5-ftr47"] Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.887395 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mrsb\" (UniqueName: \"kubernetes.io/projected/e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a-kube-api-access-4mrsb\") pod \"e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a\" (UID: \"e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a\") " Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.887449 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a-logs\") pod \"e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a\" (UID: \"e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a\") " Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.887606 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a-scripts\") pod \"e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a\" (UID: \"e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a\") " Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.887672 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a-horizon-secret-key\") pod \"e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a\" (UID: \"e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a\") " Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.887701 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a-config-data\") pod \"e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a\" (UID: \"e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a\") " Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.888032 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79e2926f-2fec-48fb-95d2-c3afcfec7c4c-public-tls-certs\") pod \"barbican-api-5587b8897d-cknc7\" (UID: \"79e2926f-2fec-48fb-95d2-c3afcfec7c4c\") " pod="openstack/barbican-api-5587b8897d-cknc7" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.888083 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79e2926f-2fec-48fb-95d2-c3afcfec7c4c-logs\") pod \"barbican-api-5587b8897d-cknc7\" (UID: \"79e2926f-2fec-48fb-95d2-c3afcfec7c4c\") " 
pod="openstack/barbican-api-5587b8897d-cknc7" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.888156 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79e2926f-2fec-48fb-95d2-c3afcfec7c4c-config-data\") pod \"barbican-api-5587b8897d-cknc7\" (UID: \"79e2926f-2fec-48fb-95d2-c3afcfec7c4c\") " pod="openstack/barbican-api-5587b8897d-cknc7" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.888195 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79e2926f-2fec-48fb-95d2-c3afcfec7c4c-internal-tls-certs\") pod \"barbican-api-5587b8897d-cknc7\" (UID: \"79e2926f-2fec-48fb-95d2-c3afcfec7c4c\") " pod="openstack/barbican-api-5587b8897d-cknc7" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.888241 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp27m\" (UniqueName: \"kubernetes.io/projected/79e2926f-2fec-48fb-95d2-c3afcfec7c4c-kube-api-access-gp27m\") pod \"barbican-api-5587b8897d-cknc7\" (UID: \"79e2926f-2fec-48fb-95d2-c3afcfec7c4c\") " pod="openstack/barbican-api-5587b8897d-cknc7" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.888289 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79e2926f-2fec-48fb-95d2-c3afcfec7c4c-combined-ca-bundle\") pod \"barbican-api-5587b8897d-cknc7\" (UID: \"79e2926f-2fec-48fb-95d2-c3afcfec7c4c\") " pod="openstack/barbican-api-5587b8897d-cknc7" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.888319 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79e2926f-2fec-48fb-95d2-c3afcfec7c4c-config-data-custom\") pod 
\"barbican-api-5587b8897d-cknc7\" (UID: \"79e2926f-2fec-48fb-95d2-c3afcfec7c4c\") " pod="openstack/barbican-api-5587b8897d-cknc7" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.889744 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a-logs" (OuterVolumeSpecName: "logs") pod "e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a" (UID: "e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.891820 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a-kube-api-access-4mrsb" (OuterVolumeSpecName: "kube-api-access-4mrsb") pod "e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a" (UID: "e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a"). InnerVolumeSpecName "kube-api-access-4mrsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.892766 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7c54b8bfd5-ftr47"] Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.894680 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a" (UID: "e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.914318 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a-scripts" (OuterVolumeSpecName: "scripts") pod "e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a" (UID: "e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.923015 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a-config-data" (OuterVolumeSpecName: "config-data") pod "e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a" (UID: "e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.949699 4774 scope.go:117] "RemoveContainer" containerID="2a9b54c31753f543f9667511be61815d3355dd815fbe91ce4cfab5a8a287e841" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.966551 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-587669876f-dmzh9" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.990143 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79e2926f-2fec-48fb-95d2-c3afcfec7c4c-internal-tls-certs\") pod \"barbican-api-5587b8897d-cknc7\" (UID: \"79e2926f-2fec-48fb-95d2-c3afcfec7c4c\") " pod="openstack/barbican-api-5587b8897d-cknc7" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.990226 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gp27m\" (UniqueName: \"kubernetes.io/projected/79e2926f-2fec-48fb-95d2-c3afcfec7c4c-kube-api-access-gp27m\") pod \"barbican-api-5587b8897d-cknc7\" (UID: \"79e2926f-2fec-48fb-95d2-c3afcfec7c4c\") " pod="openstack/barbican-api-5587b8897d-cknc7" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.990284 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79e2926f-2fec-48fb-95d2-c3afcfec7c4c-combined-ca-bundle\") pod \"barbican-api-5587b8897d-cknc7\" (UID: 
\"79e2926f-2fec-48fb-95d2-c3afcfec7c4c\") " pod="openstack/barbican-api-5587b8897d-cknc7" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.990309 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79e2926f-2fec-48fb-95d2-c3afcfec7c4c-config-data-custom\") pod \"barbican-api-5587b8897d-cknc7\" (UID: \"79e2926f-2fec-48fb-95d2-c3afcfec7c4c\") " pod="openstack/barbican-api-5587b8897d-cknc7" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.990482 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79e2926f-2fec-48fb-95d2-c3afcfec7c4c-public-tls-certs\") pod \"barbican-api-5587b8897d-cknc7\" (UID: \"79e2926f-2fec-48fb-95d2-c3afcfec7c4c\") " pod="openstack/barbican-api-5587b8897d-cknc7" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.990514 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79e2926f-2fec-48fb-95d2-c3afcfec7c4c-logs\") pod \"barbican-api-5587b8897d-cknc7\" (UID: \"79e2926f-2fec-48fb-95d2-c3afcfec7c4c\") " pod="openstack/barbican-api-5587b8897d-cknc7" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.990592 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79e2926f-2fec-48fb-95d2-c3afcfec7c4c-config-data\") pod \"barbican-api-5587b8897d-cknc7\" (UID: \"79e2926f-2fec-48fb-95d2-c3afcfec7c4c\") " pod="openstack/barbican-api-5587b8897d-cknc7" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.990661 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.990677 4774 reconciler_common.go:293] "Volume detached for 
volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.990686 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.990718 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mrsb\" (UniqueName: \"kubernetes.io/projected/e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a-kube-api-access-4mrsb\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.990726 4774 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a-logs\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.991852 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79e2926f-2fec-48fb-95d2-c3afcfec7c4c-logs\") pod \"barbican-api-5587b8897d-cknc7\" (UID: \"79e2926f-2fec-48fb-95d2-c3afcfec7c4c\") " pod="openstack/barbican-api-5587b8897d-cknc7" Oct 03 15:01:29 crc kubenswrapper[4774]: I1003 15:01:29.994409 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79e2926f-2fec-48fb-95d2-c3afcfec7c4c-internal-tls-certs\") pod \"barbican-api-5587b8897d-cknc7\" (UID: \"79e2926f-2fec-48fb-95d2-c3afcfec7c4c\") " pod="openstack/barbican-api-5587b8897d-cknc7" Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.006394 4774 scope.go:117] "RemoveContainer" containerID="247a9528a6cf7a41ed40e9c7c10c6a24c2c7ff0accb2b9421e74e4e566487827" Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.007671 4774 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79e2926f-2fec-48fb-95d2-c3afcfec7c4c-config-data\") pod \"barbican-api-5587b8897d-cknc7\" (UID: \"79e2926f-2fec-48fb-95d2-c3afcfec7c4c\") " pod="openstack/barbican-api-5587b8897d-cknc7" Oct 03 15:01:30 crc kubenswrapper[4774]: E1003 15:01:30.007758 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"247a9528a6cf7a41ed40e9c7c10c6a24c2c7ff0accb2b9421e74e4e566487827\": container with ID starting with 247a9528a6cf7a41ed40e9c7c10c6a24c2c7ff0accb2b9421e74e4e566487827 not found: ID does not exist" containerID="247a9528a6cf7a41ed40e9c7c10c6a24c2c7ff0accb2b9421e74e4e566487827" Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.007807 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"247a9528a6cf7a41ed40e9c7c10c6a24c2c7ff0accb2b9421e74e4e566487827"} err="failed to get container status \"247a9528a6cf7a41ed40e9c7c10c6a24c2c7ff0accb2b9421e74e4e566487827\": rpc error: code = NotFound desc = could not find container \"247a9528a6cf7a41ed40e9c7c10c6a24c2c7ff0accb2b9421e74e4e566487827\": container with ID starting with 247a9528a6cf7a41ed40e9c7c10c6a24c2c7ff0accb2b9421e74e4e566487827 not found: ID does not exist" Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.007835 4774 scope.go:117] "RemoveContainer" containerID="2a9b54c31753f543f9667511be61815d3355dd815fbe91ce4cfab5a8a287e841" Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.008187 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79e2926f-2fec-48fb-95d2-c3afcfec7c4c-combined-ca-bundle\") pod \"barbican-api-5587b8897d-cknc7\" (UID: \"79e2926f-2fec-48fb-95d2-c3afcfec7c4c\") " pod="openstack/barbican-api-5587b8897d-cknc7" Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.008537 4774 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79e2926f-2fec-48fb-95d2-c3afcfec7c4c-public-tls-certs\") pod \"barbican-api-5587b8897d-cknc7\" (UID: \"79e2926f-2fec-48fb-95d2-c3afcfec7c4c\") " pod="openstack/barbican-api-5587b8897d-cknc7" Oct 03 15:01:30 crc kubenswrapper[4774]: E1003 15:01:30.008789 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a9b54c31753f543f9667511be61815d3355dd815fbe91ce4cfab5a8a287e841\": container with ID starting with 2a9b54c31753f543f9667511be61815d3355dd815fbe91ce4cfab5a8a287e841 not found: ID does not exist" containerID="2a9b54c31753f543f9667511be61815d3355dd815fbe91ce4cfab5a8a287e841" Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.008861 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a9b54c31753f543f9667511be61815d3355dd815fbe91ce4cfab5a8a287e841"} err="failed to get container status \"2a9b54c31753f543f9667511be61815d3355dd815fbe91ce4cfab5a8a287e841\": rpc error: code = NotFound desc = could not find container \"2a9b54c31753f543f9667511be61815d3355dd815fbe91ce4cfab5a8a287e841\": container with ID starting with 2a9b54c31753f543f9667511be61815d3355dd815fbe91ce4cfab5a8a287e841 not found: ID does not exist" Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.008902 4774 scope.go:117] "RemoveContainer" containerID="247a9528a6cf7a41ed40e9c7c10c6a24c2c7ff0accb2b9421e74e4e566487827" Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.009582 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"247a9528a6cf7a41ed40e9c7c10c6a24c2c7ff0accb2b9421e74e4e566487827"} err="failed to get container status \"247a9528a6cf7a41ed40e9c7c10c6a24c2c7ff0accb2b9421e74e4e566487827\": rpc error: code = NotFound desc = could not find container \"247a9528a6cf7a41ed40e9c7c10c6a24c2c7ff0accb2b9421e74e4e566487827\": container with ID starting 
with 247a9528a6cf7a41ed40e9c7c10c6a24c2c7ff0accb2b9421e74e4e566487827 not found: ID does not exist" Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.009649 4774 scope.go:117] "RemoveContainer" containerID="2a9b54c31753f543f9667511be61815d3355dd815fbe91ce4cfab5a8a287e841" Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.011525 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a9b54c31753f543f9667511be61815d3355dd815fbe91ce4cfab5a8a287e841"} err="failed to get container status \"2a9b54c31753f543f9667511be61815d3355dd815fbe91ce4cfab5a8a287e841\": rpc error: code = NotFound desc = could not find container \"2a9b54c31753f543f9667511be61815d3355dd815fbe91ce4cfab5a8a287e841\": container with ID starting with 2a9b54c31753f543f9667511be61815d3355dd815fbe91ce4cfab5a8a287e841 not found: ID does not exist" Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.013697 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp27m\" (UniqueName: \"kubernetes.io/projected/79e2926f-2fec-48fb-95d2-c3afcfec7c4c-kube-api-access-gp27m\") pod \"barbican-api-5587b8897d-cknc7\" (UID: \"79e2926f-2fec-48fb-95d2-c3afcfec7c4c\") " pod="openstack/barbican-api-5587b8897d-cknc7" Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.024121 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79e2926f-2fec-48fb-95d2-c3afcfec7c4c-config-data-custom\") pod \"barbican-api-5587b8897d-cknc7\" (UID: \"79e2926f-2fec-48fb-95d2-c3afcfec7c4c\") " pod="openstack/barbican-api-5587b8897d-cknc7" Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.091383 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b635609a-ab1c-4691-be86-da83abc3e663-logs\") pod \"b635609a-ab1c-4691-be86-da83abc3e663\" (UID: \"b635609a-ab1c-4691-be86-da83abc3e663\") " Oct 03 
15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.091493 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b635609a-ab1c-4691-be86-da83abc3e663-horizon-secret-key\") pod \"b635609a-ab1c-4691-be86-da83abc3e663\" (UID: \"b635609a-ab1c-4691-be86-da83abc3e663\") " Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.091545 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bts6v\" (UniqueName: \"kubernetes.io/projected/b635609a-ab1c-4691-be86-da83abc3e663-kube-api-access-bts6v\") pod \"b635609a-ab1c-4691-be86-da83abc3e663\" (UID: \"b635609a-ab1c-4691-be86-da83abc3e663\") " Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.091664 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b635609a-ab1c-4691-be86-da83abc3e663-scripts\") pod \"b635609a-ab1c-4691-be86-da83abc3e663\" (UID: \"b635609a-ab1c-4691-be86-da83abc3e663\") " Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.091688 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b635609a-ab1c-4691-be86-da83abc3e663-config-data\") pod \"b635609a-ab1c-4691-be86-da83abc3e663\" (UID: \"b635609a-ab1c-4691-be86-da83abc3e663\") " Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.091790 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b635609a-ab1c-4691-be86-da83abc3e663-logs" (OuterVolumeSpecName: "logs") pod "b635609a-ab1c-4691-be86-da83abc3e663" (UID: "b635609a-ab1c-4691-be86-da83abc3e663"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.092142 4774 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b635609a-ab1c-4691-be86-da83abc3e663-logs\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.095414 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b635609a-ab1c-4691-be86-da83abc3e663-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "b635609a-ab1c-4691-be86-da83abc3e663" (UID: "b635609a-ab1c-4691-be86-da83abc3e663"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.095663 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5587b8897d-cknc7" Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.096007 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b635609a-ab1c-4691-be86-da83abc3e663-kube-api-access-bts6v" (OuterVolumeSpecName: "kube-api-access-bts6v") pod "b635609a-ab1c-4691-be86-da83abc3e663" (UID: "b635609a-ab1c-4691-be86-da83abc3e663"). InnerVolumeSpecName "kube-api-access-bts6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.117344 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b635609a-ab1c-4691-be86-da83abc3e663-scripts" (OuterVolumeSpecName: "scripts") pod "b635609a-ab1c-4691-be86-da83abc3e663" (UID: "b635609a-ab1c-4691-be86-da83abc3e663"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.122976 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b635609a-ab1c-4691-be86-da83abc3e663-config-data" (OuterVolumeSpecName: "config-data") pod "b635609a-ab1c-4691-be86-da83abc3e663" (UID: "b635609a-ab1c-4691-be86-da83abc3e663"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.224705 4774 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b635609a-ab1c-4691-be86-da83abc3e663-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.224736 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bts6v\" (UniqueName: \"kubernetes.io/projected/b635609a-ab1c-4691-be86-da83abc3e663-kube-api-access-bts6v\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.224747 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b635609a-ab1c-4691-be86-da83abc3e663-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.224756 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b635609a-ab1c-4691-be86-da83abc3e663-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.567914 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-587669876f-dmzh9" event={"ID":"b635609a-ab1c-4691-be86-da83abc3e663","Type":"ContainerDied","Data":"079f3fb4b4b86f3c8e890f020e1736f355967a408b5692525430b0ad3971aaeb"} Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.568339 4774 scope.go:117] "RemoveContainer" 
containerID="be81210244b0fa3606b3f1432526e204cc82f1277bf8001d1f74902aad26e524" Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.568635 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-587669876f-dmzh9" Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.572309 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5587b8897d-cknc7"] Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.573272 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-765f99df69-hhjpp" event={"ID":"e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a","Type":"ContainerDied","Data":"b6edec92a6c97390450306ed5b9579cfdb4999e4c83cd44fef8162553648f1d3"} Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.573387 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-765f99df69-hhjpp" Oct 03 15:01:30 crc kubenswrapper[4774]: W1003 15:01:30.591080 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79e2926f_2fec_48fb_95d2_c3afcfec7c4c.slice/crio-98131f31d54649ae483ce96341015f5e9e4e52716758f0229628a0794921acd8 WatchSource:0}: Error finding container 98131f31d54649ae483ce96341015f5e9e4e52716758f0229628a0794921acd8: Status 404 returned error can't find the container with id 98131f31d54649ae483ce96341015f5e9e4e52716758f0229628a0794921acd8 Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.607281 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-587669876f-dmzh9"] Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.613271 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-587669876f-dmzh9"] Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 15:01:30.660966 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-765f99df69-hhjpp"] Oct 03 15:01:30 crc kubenswrapper[4774]: I1003 
15:01:30.670250 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-765f99df69-hhjpp"] Oct 03 15:01:31 crc kubenswrapper[4774]: I1003 15:01:31.080094 4774 scope.go:117] "RemoveContainer" containerID="d0e93c9919c9cc5e5ca30ec385dfc4897c2c951e0d054737ed3f23c89bc0759d" Oct 03 15:01:31 crc kubenswrapper[4774]: I1003 15:01:31.318750 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b635609a-ab1c-4691-be86-da83abc3e663" path="/var/lib/kubelet/pods/b635609a-ab1c-4691-be86-da83abc3e663/volumes" Oct 03 15:01:31 crc kubenswrapper[4774]: I1003 15:01:31.320076 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a" path="/var/lib/kubelet/pods/e8e1eca7-5e1d-41bb-be5a-74fbe8c4f86a/volumes" Oct 03 15:01:31 crc kubenswrapper[4774]: I1003 15:01:31.321463 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc47f9df-51ec-4aad-861b-c04b1321c5a3" path="/var/lib/kubelet/pods/fc47f9df-51ec-4aad-861b-c04b1321c5a3/volumes" Oct 03 15:01:31 crc kubenswrapper[4774]: I1003 15:01:31.357743 4774 scope.go:117] "RemoveContainer" containerID="070d22da48804090a3d16634343e38782fa1cfae91b0c20b52e74ac8cb456ea9" Oct 03 15:01:31 crc kubenswrapper[4774]: I1003 15:01:31.587869 4774 generic.go:334] "Generic (PLEG): container finished" podID="be96775d-6115-4c8e-8539-230de2424b0e" containerID="3e491844db25d065fe046339b9812d693015a5f1b6064363a997ecb709ecf30d" exitCode=0 Oct 03 15:01:31 crc kubenswrapper[4774]: I1003 15:01:31.587947 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-sx6cl" event={"ID":"be96775d-6115-4c8e-8539-230de2424b0e","Type":"ContainerDied","Data":"3e491844db25d065fe046339b9812d693015a5f1b6064363a997ecb709ecf30d"} Oct 03 15:01:31 crc kubenswrapper[4774]: I1003 15:01:31.590432 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5587b8897d-cknc7" 
event={"ID":"79e2926f-2fec-48fb-95d2-c3afcfec7c4c","Type":"ContainerStarted","Data":"98131f31d54649ae483ce96341015f5e9e4e52716758f0229628a0794921acd8"} Oct 03 15:01:31 crc kubenswrapper[4774]: I1003 15:01:31.600835 4774 scope.go:117] "RemoveContainer" containerID="0d7e8b2630f6e23e079e498a4d690a2d7552d2b105fae7efa228ba39fd181246" Oct 03 15:01:31 crc kubenswrapper[4774]: I1003 15:01:31.613865 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-bc5bdf456-xt2x4" podUID="871f7d16-54b6-4aa9-8e99-00a888d41f70" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Oct 03 15:01:32 crc kubenswrapper[4774]: I1003 15:01:32.603314 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-74d9f6d95f-l289b" event={"ID":"04d07033-b1e8-426d-828b-e78cb0f44294","Type":"ContainerStarted","Data":"505eedebdbf6cc7ca9bdc9a4be6445346563a635ac5c07d075717fd5d4f0bb18"} Oct 03 15:01:32 crc kubenswrapper[4774]: I1003 15:01:32.604061 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-74d9f6d95f-l289b" event={"ID":"04d07033-b1e8-426d-828b-e78cb0f44294","Type":"ContainerStarted","Data":"481e6b83df0f57ddc1e3fb5a8fb212314157c1af4d21ecd0897866c95d764f15"} Oct 03 15:01:32 crc kubenswrapper[4774]: I1003 15:01:32.606716 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c51760f-5a26-453e-b578-3bc16d784a4a","Type":"ContainerStarted","Data":"929567c8817b96e7c982eb4b05370573427dbb7f549f86e306e63a28d248bac1"} Oct 03 15:01:32 crc kubenswrapper[4774]: I1003 15:01:32.607363 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 15:01:32 crc kubenswrapper[4774]: I1003 15:01:32.610233 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5587b8897d-cknc7" 
event={"ID":"79e2926f-2fec-48fb-95d2-c3afcfec7c4c","Type":"ContainerStarted","Data":"6dd278310e792193078b368d443cdeaf823b682110ba4e5824e5c68480ce75f8"} Oct 03 15:01:32 crc kubenswrapper[4774]: I1003 15:01:32.610281 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5587b8897d-cknc7" event={"ID":"79e2926f-2fec-48fb-95d2-c3afcfec7c4c","Type":"ContainerStarted","Data":"f51d8605156656b0e0b0be17c47e4a378033c6c94f15be338fea5ff76c83add8"} Oct 03 15:01:32 crc kubenswrapper[4774]: I1003 15:01:32.611204 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5587b8897d-cknc7" Oct 03 15:01:32 crc kubenswrapper[4774]: I1003 15:01:32.611247 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5587b8897d-cknc7" Oct 03 15:01:32 crc kubenswrapper[4774]: I1003 15:01:32.613461 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b679cb88b-5lrzw" event={"ID":"3ceaeb2a-3322-44b9-88ae-5c473721a68f","Type":"ContainerStarted","Data":"45ddf2354434830b666ea59bf4de9aac325749706b0f396c7cbb2fb02ed11401"} Oct 03 15:01:32 crc kubenswrapper[4774]: I1003 15:01:32.613495 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b679cb88b-5lrzw" event={"ID":"3ceaeb2a-3322-44b9-88ae-5c473721a68f","Type":"ContainerStarted","Data":"cb482a3dd610492916589c945080834834572ed73819cd591dbe1ab249a1e9eb"} Oct 03 15:01:32 crc kubenswrapper[4774]: I1003 15:01:32.629740 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-74d9f6d95f-l289b" podStartSLOduration=2.733519564 podStartE2EDuration="6.629717694s" podCreationTimestamp="2025-10-03 15:01:26 +0000 UTC" firstStartedPulling="2025-10-03 15:01:27.704830948 +0000 UTC m=+1110.294034400" lastFinishedPulling="2025-10-03 15:01:31.601029078 +0000 UTC m=+1114.190232530" observedRunningTime="2025-10-03 15:01:32.62349992 
+0000 UTC m=+1115.212703372" watchObservedRunningTime="2025-10-03 15:01:32.629717694 +0000 UTC m=+1115.218921146" Oct 03 15:01:32 crc kubenswrapper[4774]: I1003 15:01:32.661323 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5587b8897d-cknc7" podStartSLOduration=3.661273857 podStartE2EDuration="3.661273857s" podCreationTimestamp="2025-10-03 15:01:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:01:32.657130214 +0000 UTC m=+1115.246333676" watchObservedRunningTime="2025-10-03 15:01:32.661273857 +0000 UTC m=+1115.250477309" Oct 03 15:01:32 crc kubenswrapper[4774]: I1003 15:01:32.684706 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5b679cb88b-5lrzw" podStartSLOduration=2.702796744 podStartE2EDuration="6.684684137s" podCreationTimestamp="2025-10-03 15:01:26 +0000 UTC" firstStartedPulling="2025-10-03 15:01:27.619581706 +0000 UTC m=+1110.208785158" lastFinishedPulling="2025-10-03 15:01:31.601469099 +0000 UTC m=+1114.190672551" observedRunningTime="2025-10-03 15:01:32.683961189 +0000 UTC m=+1115.273164641" watchObservedRunningTime="2025-10-03 15:01:32.684684137 +0000 UTC m=+1115.273887589" Oct 03 15:01:32 crc kubenswrapper[4774]: I1003 15:01:32.712357 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.478733773 podStartE2EDuration="9.712335712s" podCreationTimestamp="2025-10-03 15:01:23 +0000 UTC" firstStartedPulling="2025-10-03 15:01:24.367140582 +0000 UTC m=+1106.956344034" lastFinishedPulling="2025-10-03 15:01:31.600742501 +0000 UTC m=+1114.189945973" observedRunningTime="2025-10-03 15:01:32.70580527 +0000 UTC m=+1115.295008722" watchObservedRunningTime="2025-10-03 15:01:32.712335712 +0000 UTC m=+1115.301539164" Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.046820 
4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-sx6cl" Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.177079 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/be96775d-6115-4c8e-8539-230de2424b0e-db-sync-config-data\") pod \"be96775d-6115-4c8e-8539-230de2424b0e\" (UID: \"be96775d-6115-4c8e-8539-230de2424b0e\") " Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.177131 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/be96775d-6115-4c8e-8539-230de2424b0e-etc-machine-id\") pod \"be96775d-6115-4c8e-8539-230de2424b0e\" (UID: \"be96775d-6115-4c8e-8539-230de2424b0e\") " Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.177206 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-972jg\" (UniqueName: \"kubernetes.io/projected/be96775d-6115-4c8e-8539-230de2424b0e-kube-api-access-972jg\") pod \"be96775d-6115-4c8e-8539-230de2424b0e\" (UID: \"be96775d-6115-4c8e-8539-230de2424b0e\") " Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.177303 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be96775d-6115-4c8e-8539-230de2424b0e-config-data\") pod \"be96775d-6115-4c8e-8539-230de2424b0e\" (UID: \"be96775d-6115-4c8e-8539-230de2424b0e\") " Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.177322 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be96775d-6115-4c8e-8539-230de2424b0e-combined-ca-bundle\") pod \"be96775d-6115-4c8e-8539-230de2424b0e\" (UID: \"be96775d-6115-4c8e-8539-230de2424b0e\") " Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.177403 4774 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be96775d-6115-4c8e-8539-230de2424b0e-scripts\") pod \"be96775d-6115-4c8e-8539-230de2424b0e\" (UID: \"be96775d-6115-4c8e-8539-230de2424b0e\") " Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.183150 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/be96775d-6115-4c8e-8539-230de2424b0e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "be96775d-6115-4c8e-8539-230de2424b0e" (UID: "be96775d-6115-4c8e-8539-230de2424b0e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.187550 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be96775d-6115-4c8e-8539-230de2424b0e-scripts" (OuterVolumeSpecName: "scripts") pod "be96775d-6115-4c8e-8539-230de2424b0e" (UID: "be96775d-6115-4c8e-8539-230de2424b0e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.187567 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be96775d-6115-4c8e-8539-230de2424b0e-kube-api-access-972jg" (OuterVolumeSpecName: "kube-api-access-972jg") pod "be96775d-6115-4c8e-8539-230de2424b0e" (UID: "be96775d-6115-4c8e-8539-230de2424b0e"). InnerVolumeSpecName "kube-api-access-972jg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.187652 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be96775d-6115-4c8e-8539-230de2424b0e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "be96775d-6115-4c8e-8539-230de2424b0e" (UID: "be96775d-6115-4c8e-8539-230de2424b0e"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.215467 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be96775d-6115-4c8e-8539-230de2424b0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be96775d-6115-4c8e-8539-230de2424b0e" (UID: "be96775d-6115-4c8e-8539-230de2424b0e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.261821 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be96775d-6115-4c8e-8539-230de2424b0e-config-data" (OuterVolumeSpecName: "config-data") pod "be96775d-6115-4c8e-8539-230de2424b0e" (UID: "be96775d-6115-4c8e-8539-230de2424b0e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.280409 4774 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/be96775d-6115-4c8e-8539-230de2424b0e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.280440 4774 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/be96775d-6115-4c8e-8539-230de2424b0e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.280450 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-972jg\" (UniqueName: \"kubernetes.io/projected/be96775d-6115-4c8e-8539-230de2424b0e-kube-api-access-972jg\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.280462 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be96775d-6115-4c8e-8539-230de2424b0e-config-data\") on node \"crc\" 
DevicePath \"\"" Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.280470 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be96775d-6115-4c8e-8539-230de2424b0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.280478 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be96775d-6115-4c8e-8539-230de2424b0e-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.642319 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-sx6cl" event={"ID":"be96775d-6115-4c8e-8539-230de2424b0e","Type":"ContainerDied","Data":"8d38e8df35eb5aa4bf7257373ce1154d0af8bdfde0cd1c44302f4b29d61d7f40"} Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.642650 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d38e8df35eb5aa4bf7257373ce1154d0af8bdfde0cd1c44302f4b29d61d7f40" Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.643091 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-sx6cl" Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.932324 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 15:01:33 crc kubenswrapper[4774]: E1003 15:01:33.932698 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b635609a-ab1c-4691-be86-da83abc3e663" containerName="horizon" Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.932712 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="b635609a-ab1c-4691-be86-da83abc3e663" containerName="horizon" Oct 03 15:01:33 crc kubenswrapper[4774]: E1003 15:01:33.932738 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b635609a-ab1c-4691-be86-da83abc3e663" containerName="horizon-log" Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.932744 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="b635609a-ab1c-4691-be86-da83abc3e663" containerName="horizon-log" Oct 03 15:01:33 crc kubenswrapper[4774]: E1003 15:01:33.932758 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be96775d-6115-4c8e-8539-230de2424b0e" containerName="cinder-db-sync" Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.932763 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="be96775d-6115-4c8e-8539-230de2424b0e" containerName="cinder-db-sync" Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.932940 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="b635609a-ab1c-4691-be86-da83abc3e663" containerName="horizon" Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.932948 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="b635609a-ab1c-4691-be86-da83abc3e663" containerName="horizon-log" Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.932965 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="be96775d-6115-4c8e-8539-230de2424b0e" containerName="cinder-db-sync" Oct 03 15:01:33 crc 
kubenswrapper[4774]: I1003 15:01:33.933894 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.936537 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.936723 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.937703 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.945131 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.949614 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-6swnb" Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.987006 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-5rmz7"] Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.987228 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59d5ff467f-5rmz7" podUID="95263c08-34fe-4319-ae88-dc01b7609122" containerName="dnsmasq-dns" containerID="cri-o://7e7c0515d861da53fefaf39f8fd79c6c8488d31684c14ea120fd5e118b827dd1" gracePeriod=10 Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.993668 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9ab201e1-9ef3-485b-81f2-0b421dcc66cc\") " pod="openstack/cinder-scheduler-0" Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.993745 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8tms\" (UniqueName: \"kubernetes.io/projected/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-kube-api-access-k8tms\") pod \"cinder-scheduler-0\" (UID: \"9ab201e1-9ef3-485b-81f2-0b421dcc66cc\") " pod="openstack/cinder-scheduler-0" Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.993799 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-scripts\") pod \"cinder-scheduler-0\" (UID: \"9ab201e1-9ef3-485b-81f2-0b421dcc66cc\") " pod="openstack/cinder-scheduler-0" Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.993817 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-config-data\") pod \"cinder-scheduler-0\" (UID: \"9ab201e1-9ef3-485b-81f2-0b421dcc66cc\") " pod="openstack/cinder-scheduler-0" Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.993900 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9ab201e1-9ef3-485b-81f2-0b421dcc66cc\") " pod="openstack/cinder-scheduler-0" Oct 03 15:01:33 crc kubenswrapper[4774]: I1003 15:01:33.993930 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9ab201e1-9ef3-485b-81f2-0b421dcc66cc\") " pod="openstack/cinder-scheduler-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.000548 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-59d5ff467f-5rmz7" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.070556 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-fkddj"] Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.072053 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69c986f6d7-fkddj" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.083044 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-fkddj"] Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.097218 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9ab201e1-9ef3-485b-81f2-0b421dcc66cc\") " pod="openstack/cinder-scheduler-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.098562 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9ab201e1-9ef3-485b-81f2-0b421dcc66cc\") " pod="openstack/cinder-scheduler-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.098598 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9ab201e1-9ef3-485b-81f2-0b421dcc66cc\") " pod="openstack/cinder-scheduler-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.098682 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8tms\" (UniqueName: \"kubernetes.io/projected/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-kube-api-access-k8tms\") pod \"cinder-scheduler-0\" (UID: 
\"9ab201e1-9ef3-485b-81f2-0b421dcc66cc\") " pod="openstack/cinder-scheduler-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.098968 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-scripts\") pod \"cinder-scheduler-0\" (UID: \"9ab201e1-9ef3-485b-81f2-0b421dcc66cc\") " pod="openstack/cinder-scheduler-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.098996 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-config-data\") pod \"cinder-scheduler-0\" (UID: \"9ab201e1-9ef3-485b-81f2-0b421dcc66cc\") " pod="openstack/cinder-scheduler-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.107039 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9ab201e1-9ef3-485b-81f2-0b421dcc66cc\") " pod="openstack/cinder-scheduler-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.107740 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-config-data\") pod \"cinder-scheduler-0\" (UID: \"9ab201e1-9ef3-485b-81f2-0b421dcc66cc\") " pod="openstack/cinder-scheduler-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.127917 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-scripts\") pod \"cinder-scheduler-0\" (UID: \"9ab201e1-9ef3-485b-81f2-0b421dcc66cc\") " pod="openstack/cinder-scheduler-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.146558 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9ab201e1-9ef3-485b-81f2-0b421dcc66cc\") " pod="openstack/cinder-scheduler-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.150497 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9ab201e1-9ef3-485b-81f2-0b421dcc66cc\") " pod="openstack/cinder-scheduler-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.173918 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8tms\" (UniqueName: \"kubernetes.io/projected/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-kube-api-access-k8tms\") pod \"cinder-scheduler-0\" (UID: \"9ab201e1-9ef3-485b-81f2-0b421dcc66cc\") " pod="openstack/cinder-scheduler-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.182425 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.183908 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.187666 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.194243 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.203039 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87919e0f-0800-44c9-863a-903c14884ae8-ovsdbserver-nb\") pod \"dnsmasq-dns-69c986f6d7-fkddj\" (UID: \"87919e0f-0800-44c9-863a-903c14884ae8\") " pod="openstack/dnsmasq-dns-69c986f6d7-fkddj" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.203100 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87919e0f-0800-44c9-863a-903c14884ae8-dns-svc\") pod \"dnsmasq-dns-69c986f6d7-fkddj\" (UID: \"87919e0f-0800-44c9-863a-903c14884ae8\") " pod="openstack/dnsmasq-dns-69c986f6d7-fkddj" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.203145 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87919e0f-0800-44c9-863a-903c14884ae8-dns-swift-storage-0\") pod \"dnsmasq-dns-69c986f6d7-fkddj\" (UID: \"87919e0f-0800-44c9-863a-903c14884ae8\") " pod="openstack/dnsmasq-dns-69c986f6d7-fkddj" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.203192 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7vxx\" (UniqueName: \"kubernetes.io/projected/87919e0f-0800-44c9-863a-903c14884ae8-kube-api-access-s7vxx\") pod \"dnsmasq-dns-69c986f6d7-fkddj\" (UID: \"87919e0f-0800-44c9-863a-903c14884ae8\") " 
pod="openstack/dnsmasq-dns-69c986f6d7-fkddj" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.203234 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87919e0f-0800-44c9-863a-903c14884ae8-ovsdbserver-sb\") pod \"dnsmasq-dns-69c986f6d7-fkddj\" (UID: \"87919e0f-0800-44c9-863a-903c14884ae8\") " pod="openstack/dnsmasq-dns-69c986f6d7-fkddj" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.203261 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87919e0f-0800-44c9-863a-903c14884ae8-config\") pod \"dnsmasq-dns-69c986f6d7-fkddj\" (UID: \"87919e0f-0800-44c9-863a-903c14884ae8\") " pod="openstack/dnsmasq-dns-69c986f6d7-fkddj" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.253816 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.306530 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87919e0f-0800-44c9-863a-903c14884ae8-ovsdbserver-nb\") pod \"dnsmasq-dns-69c986f6d7-fkddj\" (UID: \"87919e0f-0800-44c9-863a-903c14884ae8\") " pod="openstack/dnsmasq-dns-69c986f6d7-fkddj" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.306810 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-config-data\") pod \"cinder-api-0\" (UID: \"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2\") " pod="openstack/cinder-api-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.306853 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/87919e0f-0800-44c9-863a-903c14884ae8-dns-svc\") pod \"dnsmasq-dns-69c986f6d7-fkddj\" (UID: \"87919e0f-0800-44c9-863a-903c14884ae8\") " pod="openstack/dnsmasq-dns-69c986f6d7-fkddj" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.306871 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-config-data-custom\") pod \"cinder-api-0\" (UID: \"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2\") " pod="openstack/cinder-api-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.306905 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2\") " pod="openstack/cinder-api-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.306936 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87919e0f-0800-44c9-863a-903c14884ae8-dns-swift-storage-0\") pod \"dnsmasq-dns-69c986f6d7-fkddj\" (UID: \"87919e0f-0800-44c9-863a-903c14884ae8\") " pod="openstack/dnsmasq-dns-69c986f6d7-fkddj" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.306953 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-scripts\") pod \"cinder-api-0\" (UID: \"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2\") " pod="openstack/cinder-api-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.306971 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2\") " pod="openstack/cinder-api-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.306993 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-logs\") pod \"cinder-api-0\" (UID: \"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2\") " pod="openstack/cinder-api-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.307023 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7vxx\" (UniqueName: \"kubernetes.io/projected/87919e0f-0800-44c9-863a-903c14884ae8-kube-api-access-s7vxx\") pod \"dnsmasq-dns-69c986f6d7-fkddj\" (UID: \"87919e0f-0800-44c9-863a-903c14884ae8\") " pod="openstack/dnsmasq-dns-69c986f6d7-fkddj" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.307053 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87919e0f-0800-44c9-863a-903c14884ae8-ovsdbserver-sb\") pod \"dnsmasq-dns-69c986f6d7-fkddj\" (UID: \"87919e0f-0800-44c9-863a-903c14884ae8\") " pod="openstack/dnsmasq-dns-69c986f6d7-fkddj" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.307070 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87919e0f-0800-44c9-863a-903c14884ae8-config\") pod \"dnsmasq-dns-69c986f6d7-fkddj\" (UID: \"87919e0f-0800-44c9-863a-903c14884ae8\") " pod="openstack/dnsmasq-dns-69c986f6d7-fkddj" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.307118 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6k7w\" (UniqueName: \"kubernetes.io/projected/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-kube-api-access-z6k7w\") pod 
\"cinder-api-0\" (UID: \"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2\") " pod="openstack/cinder-api-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.307415 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87919e0f-0800-44c9-863a-903c14884ae8-ovsdbserver-nb\") pod \"dnsmasq-dns-69c986f6d7-fkddj\" (UID: \"87919e0f-0800-44c9-863a-903c14884ae8\") " pod="openstack/dnsmasq-dns-69c986f6d7-fkddj" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.307848 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87919e0f-0800-44c9-863a-903c14884ae8-dns-svc\") pod \"dnsmasq-dns-69c986f6d7-fkddj\" (UID: \"87919e0f-0800-44c9-863a-903c14884ae8\") " pod="openstack/dnsmasq-dns-69c986f6d7-fkddj" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.308057 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87919e0f-0800-44c9-863a-903c14884ae8-dns-swift-storage-0\") pod \"dnsmasq-dns-69c986f6d7-fkddj\" (UID: \"87919e0f-0800-44c9-863a-903c14884ae8\") " pod="openstack/dnsmasq-dns-69c986f6d7-fkddj" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.308226 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87919e0f-0800-44c9-863a-903c14884ae8-config\") pod \"dnsmasq-dns-69c986f6d7-fkddj\" (UID: \"87919e0f-0800-44c9-863a-903c14884ae8\") " pod="openstack/dnsmasq-dns-69c986f6d7-fkddj" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.308738 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87919e0f-0800-44c9-863a-903c14884ae8-ovsdbserver-sb\") pod \"dnsmasq-dns-69c986f6d7-fkddj\" (UID: \"87919e0f-0800-44c9-863a-903c14884ae8\") " pod="openstack/dnsmasq-dns-69c986f6d7-fkddj" Oct 03 15:01:34 
crc kubenswrapper[4774]: I1003 15:01:34.330210 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7vxx\" (UniqueName: \"kubernetes.io/projected/87919e0f-0800-44c9-863a-903c14884ae8-kube-api-access-s7vxx\") pod \"dnsmasq-dns-69c986f6d7-fkddj\" (UID: \"87919e0f-0800-44c9-863a-903c14884ae8\") " pod="openstack/dnsmasq-dns-69c986f6d7-fkddj" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.368474 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-bc86db558-frxdt" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.374765 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-bc86db558-frxdt" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.404591 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69c986f6d7-fkddj" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.408226 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-config-data\") pod \"cinder-api-0\" (UID: \"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2\") " pod="openstack/cinder-api-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.408287 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-config-data-custom\") pod \"cinder-api-0\" (UID: \"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2\") " pod="openstack/cinder-api-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.408342 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2\") " pod="openstack/cinder-api-0" Oct 03 
15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.408386 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-scripts\") pod \"cinder-api-0\" (UID: \"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2\") " pod="openstack/cinder-api-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.408409 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2\") " pod="openstack/cinder-api-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.408435 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-logs\") pod \"cinder-api-0\" (UID: \"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2\") " pod="openstack/cinder-api-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.408524 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6k7w\" (UniqueName: \"kubernetes.io/projected/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-kube-api-access-z6k7w\") pod \"cinder-api-0\" (UID: \"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2\") " pod="openstack/cinder-api-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.412490 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2\") " pod="openstack/cinder-api-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.412908 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-logs\") pod \"cinder-api-0\" 
(UID: \"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2\") " pod="openstack/cinder-api-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.414406 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-scripts\") pod \"cinder-api-0\" (UID: \"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2\") " pod="openstack/cinder-api-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.414657 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-config-data\") pod \"cinder-api-0\" (UID: \"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2\") " pod="openstack/cinder-api-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.422142 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2\") " pod="openstack/cinder-api-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.429040 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-config-data-custom\") pod \"cinder-api-0\" (UID: \"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2\") " pod="openstack/cinder-api-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.459976 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6k7w\" (UniqueName: \"kubernetes.io/projected/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-kube-api-access-z6k7w\") pod \"cinder-api-0\" (UID: \"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2\") " pod="openstack/cinder-api-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.578821 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.715942 4774 generic.go:334] "Generic (PLEG): container finished" podID="95263c08-34fe-4319-ae88-dc01b7609122" containerID="7e7c0515d861da53fefaf39f8fd79c6c8488d31684c14ea120fd5e118b827dd1" exitCode=0 Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.716433 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-5rmz7" event={"ID":"95263c08-34fe-4319-ae88-dc01b7609122","Type":"ContainerDied","Data":"7e7c0515d861da53fefaf39f8fd79c6c8488d31684c14ea120fd5e118b827dd1"} Oct 03 15:01:34 crc kubenswrapper[4774]: I1003 15:01:34.881281 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59d5ff467f-5rmz7" Oct 03 15:01:35 crc kubenswrapper[4774]: I1003 15:01:35.027147 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/95263c08-34fe-4319-ae88-dc01b7609122-dns-swift-storage-0\") pod \"95263c08-34fe-4319-ae88-dc01b7609122\" (UID: \"95263c08-34fe-4319-ae88-dc01b7609122\") " Oct 03 15:01:35 crc kubenswrapper[4774]: I1003 15:01:35.027529 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvpxj\" (UniqueName: \"kubernetes.io/projected/95263c08-34fe-4319-ae88-dc01b7609122-kube-api-access-vvpxj\") pod \"95263c08-34fe-4319-ae88-dc01b7609122\" (UID: \"95263c08-34fe-4319-ae88-dc01b7609122\") " Oct 03 15:01:35 crc kubenswrapper[4774]: I1003 15:01:35.028148 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95263c08-34fe-4319-ae88-dc01b7609122-config\") pod \"95263c08-34fe-4319-ae88-dc01b7609122\" (UID: \"95263c08-34fe-4319-ae88-dc01b7609122\") " Oct 03 15:01:35 crc kubenswrapper[4774]: I1003 15:01:35.028513 4774 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95263c08-34fe-4319-ae88-dc01b7609122-ovsdbserver-nb\") pod \"95263c08-34fe-4319-ae88-dc01b7609122\" (UID: \"95263c08-34fe-4319-ae88-dc01b7609122\") " Oct 03 15:01:35 crc kubenswrapper[4774]: I1003 15:01:35.028692 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95263c08-34fe-4319-ae88-dc01b7609122-ovsdbserver-sb\") pod \"95263c08-34fe-4319-ae88-dc01b7609122\" (UID: \"95263c08-34fe-4319-ae88-dc01b7609122\") " Oct 03 15:01:35 crc kubenswrapper[4774]: I1003 15:01:35.028856 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95263c08-34fe-4319-ae88-dc01b7609122-dns-svc\") pod \"95263c08-34fe-4319-ae88-dc01b7609122\" (UID: \"95263c08-34fe-4319-ae88-dc01b7609122\") " Oct 03 15:01:35 crc kubenswrapper[4774]: I1003 15:01:35.031796 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95263c08-34fe-4319-ae88-dc01b7609122-kube-api-access-vvpxj" (OuterVolumeSpecName: "kube-api-access-vvpxj") pod "95263c08-34fe-4319-ae88-dc01b7609122" (UID: "95263c08-34fe-4319-ae88-dc01b7609122"). InnerVolumeSpecName "kube-api-access-vvpxj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:01:35 crc kubenswrapper[4774]: I1003 15:01:35.078391 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 15:01:35 crc kubenswrapper[4774]: I1003 15:01:35.130712 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvpxj\" (UniqueName: \"kubernetes.io/projected/95263c08-34fe-4319-ae88-dc01b7609122-kube-api-access-vvpxj\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:35 crc kubenswrapper[4774]: I1003 15:01:35.141579 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95263c08-34fe-4319-ae88-dc01b7609122-config" (OuterVolumeSpecName: "config") pod "95263c08-34fe-4319-ae88-dc01b7609122" (UID: "95263c08-34fe-4319-ae88-dc01b7609122"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:01:35 crc kubenswrapper[4774]: I1003 15:01:35.143717 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95263c08-34fe-4319-ae88-dc01b7609122-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "95263c08-34fe-4319-ae88-dc01b7609122" (UID: "95263c08-34fe-4319-ae88-dc01b7609122"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:01:35 crc kubenswrapper[4774]: I1003 15:01:35.147309 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95263c08-34fe-4319-ae88-dc01b7609122-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "95263c08-34fe-4319-ae88-dc01b7609122" (UID: "95263c08-34fe-4319-ae88-dc01b7609122"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:01:35 crc kubenswrapper[4774]: I1003 15:01:35.156831 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95263c08-34fe-4319-ae88-dc01b7609122-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "95263c08-34fe-4319-ae88-dc01b7609122" (UID: "95263c08-34fe-4319-ae88-dc01b7609122"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:01:35 crc kubenswrapper[4774]: I1003 15:01:35.163861 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95263c08-34fe-4319-ae88-dc01b7609122-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "95263c08-34fe-4319-ae88-dc01b7609122" (UID: "95263c08-34fe-4319-ae88-dc01b7609122"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:01:35 crc kubenswrapper[4774]: I1003 15:01:35.224910 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-fkddj"] Oct 03 15:01:35 crc kubenswrapper[4774]: I1003 15:01:35.232452 4774 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95263c08-34fe-4319-ae88-dc01b7609122-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:35 crc kubenswrapper[4774]: I1003 15:01:35.232768 4774 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/95263c08-34fe-4319-ae88-dc01b7609122-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:35 crc kubenswrapper[4774]: I1003 15:01:35.232780 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95263c08-34fe-4319-ae88-dc01b7609122-config\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:35 crc kubenswrapper[4774]: I1003 15:01:35.232789 4774 reconciler_common.go:293] 
"Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95263c08-34fe-4319-ae88-dc01b7609122-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:35 crc kubenswrapper[4774]: I1003 15:01:35.232798 4774 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95263c08-34fe-4319-ae88-dc01b7609122-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:35 crc kubenswrapper[4774]: I1003 15:01:35.360772 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 15:01:35 crc kubenswrapper[4774]: W1003 15:01:35.365666 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc3d42b5_126b_433f_b677_5fc8a2f3f1e2.slice/crio-e6e0b7db5f32e092e9c7de8ba6637494f34e877bad9b0ac96f37dbfff9dd0072 WatchSource:0}: Error finding container e6e0b7db5f32e092e9c7de8ba6637494f34e877bad9b0ac96f37dbfff9dd0072: Status 404 returned error can't find the container with id e6e0b7db5f32e092e9c7de8ba6637494f34e877bad9b0ac96f37dbfff9dd0072 Oct 03 15:01:35 crc kubenswrapper[4774]: E1003 15:01:35.656219 4774 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87919e0f_0800_44c9_863a_903c14884ae8.slice/crio-conmon-619db68667a34acd223ae1773ed5cc9757e9f64da3f5d42e6da87c5a988331fb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87919e0f_0800_44c9_863a_903c14884ae8.slice/crio-619db68667a34acd223ae1773ed5cc9757e9f64da3f5d42e6da87c5a988331fb.scope\": RecentStats: unable to find data in memory cache]" Oct 03 15:01:35 crc kubenswrapper[4774]: I1003 15:01:35.726501 4774 generic.go:334] "Generic (PLEG): container finished" podID="cdc24810-0778-4b37-8156-ecac9ae9e077" 
containerID="ce201ca2e39eab9f9957879b76b9964af29817ef6a45ec70c4b43086049a53a1" exitCode=0 Oct 03 15:01:35 crc kubenswrapper[4774]: I1003 15:01:35.726564 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-66tpj" event={"ID":"cdc24810-0778-4b37-8156-ecac9ae9e077","Type":"ContainerDied","Data":"ce201ca2e39eab9f9957879b76b9964af29817ef6a45ec70c4b43086049a53a1"} Oct 03 15:01:35 crc kubenswrapper[4774]: I1003 15:01:35.739212 4774 generic.go:334] "Generic (PLEG): container finished" podID="87919e0f-0800-44c9-863a-903c14884ae8" containerID="619db68667a34acd223ae1773ed5cc9757e9f64da3f5d42e6da87c5a988331fb" exitCode=0 Oct 03 15:01:35 crc kubenswrapper[4774]: I1003 15:01:35.739324 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c986f6d7-fkddj" event={"ID":"87919e0f-0800-44c9-863a-903c14884ae8","Type":"ContainerDied","Data":"619db68667a34acd223ae1773ed5cc9757e9f64da3f5d42e6da87c5a988331fb"} Oct 03 15:01:35 crc kubenswrapper[4774]: I1003 15:01:35.739356 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c986f6d7-fkddj" event={"ID":"87919e0f-0800-44c9-863a-903c14884ae8","Type":"ContainerStarted","Data":"557a3407c5ed0384afe9bfa3f1ca54193ff85e1b18e03df7017b6c2f3f3011fa"} Oct 03 15:01:35 crc kubenswrapper[4774]: I1003 15:01:35.748357 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-5rmz7" event={"ID":"95263c08-34fe-4319-ae88-dc01b7609122","Type":"ContainerDied","Data":"1ad5057f92688cf5eff225d19ca061e8b974a6aafa9b4cb36e190f3f7541cb85"} Oct 03 15:01:35 crc kubenswrapper[4774]: I1003 15:01:35.748421 4774 scope.go:117] "RemoveContainer" containerID="7e7c0515d861da53fefaf39f8fd79c6c8488d31684c14ea120fd5e118b827dd1" Oct 03 15:01:35 crc kubenswrapper[4774]: I1003 15:01:35.748572 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59d5ff467f-5rmz7" Oct 03 15:01:35 crc kubenswrapper[4774]: I1003 15:01:35.750721 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9ab201e1-9ef3-485b-81f2-0b421dcc66cc","Type":"ContainerStarted","Data":"fe8ef285257cad8445b1f6f1fc80879753383454c33083ccb6fc4bc636252111"} Oct 03 15:01:35 crc kubenswrapper[4774]: I1003 15:01:35.751896 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2","Type":"ContainerStarted","Data":"e6e0b7db5f32e092e9c7de8ba6637494f34e877bad9b0ac96f37dbfff9dd0072"} Oct 03 15:01:35 crc kubenswrapper[4774]: I1003 15:01:35.939139 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-5rmz7"] Oct 03 15:01:35 crc kubenswrapper[4774]: I1003 15:01:35.949633 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-5rmz7"] Oct 03 15:01:35 crc kubenswrapper[4774]: I1003 15:01:35.986970 4774 scope.go:117] "RemoveContainer" containerID="c7870ac41844f9ef4459ef7e800f9d09de4f3d2a179f066890482f91ee2e5963" Oct 03 15:01:36 crc kubenswrapper[4774]: I1003 15:01:36.304839 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 03 15:01:36 crc kubenswrapper[4774]: I1003 15:01:36.769049 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2","Type":"ContainerStarted","Data":"c4aaf7233664f2fa254b7aa73d4880a832ac3a655c1a364b04e3d1b36460ce6a"} Oct 03 15:01:36 crc kubenswrapper[4774]: I1003 15:01:36.777423 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c986f6d7-fkddj" event={"ID":"87919e0f-0800-44c9-863a-903c14884ae8","Type":"ContainerStarted","Data":"00ea6331642e138a0bb3ee87ae1d65c9fd5cf5c7f4e018421fd61797ff725ee5"} Oct 03 15:01:36 crc kubenswrapper[4774]: I1003 
15:01:36.778708 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69c986f6d7-fkddj" Oct 03 15:01:36 crc kubenswrapper[4774]: I1003 15:01:36.803526 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69c986f6d7-fkddj" podStartSLOduration=2.803501495 podStartE2EDuration="2.803501495s" podCreationTimestamp="2025-10-03 15:01:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:01:36.798097941 +0000 UTC m=+1119.387301393" watchObservedRunningTime="2025-10-03 15:01:36.803501495 +0000 UTC m=+1119.392704947" Oct 03 15:01:37 crc kubenswrapper[4774]: I1003 15:01:37.150405 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-66tpj" Oct 03 15:01:37 crc kubenswrapper[4774]: I1003 15:01:37.289523 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cdc24810-0778-4b37-8156-ecac9ae9e077-config\") pod \"cdc24810-0778-4b37-8156-ecac9ae9e077\" (UID: \"cdc24810-0778-4b37-8156-ecac9ae9e077\") " Oct 03 15:01:37 crc kubenswrapper[4774]: I1003 15:01:37.290065 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdc24810-0778-4b37-8156-ecac9ae9e077-combined-ca-bundle\") pod \"cdc24810-0778-4b37-8156-ecac9ae9e077\" (UID: \"cdc24810-0778-4b37-8156-ecac9ae9e077\") " Oct 03 15:01:37 crc kubenswrapper[4774]: I1003 15:01:37.290153 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qd6c6\" (UniqueName: \"kubernetes.io/projected/cdc24810-0778-4b37-8156-ecac9ae9e077-kube-api-access-qd6c6\") pod \"cdc24810-0778-4b37-8156-ecac9ae9e077\" (UID: \"cdc24810-0778-4b37-8156-ecac9ae9e077\") " Oct 03 15:01:37 crc kubenswrapper[4774]: I1003 
15:01:37.294721 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdc24810-0778-4b37-8156-ecac9ae9e077-kube-api-access-qd6c6" (OuterVolumeSpecName: "kube-api-access-qd6c6") pod "cdc24810-0778-4b37-8156-ecac9ae9e077" (UID: "cdc24810-0778-4b37-8156-ecac9ae9e077"). InnerVolumeSpecName "kube-api-access-qd6c6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:01:37 crc kubenswrapper[4774]: I1003 15:01:37.321775 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdc24810-0778-4b37-8156-ecac9ae9e077-config" (OuterVolumeSpecName: "config") pod "cdc24810-0778-4b37-8156-ecac9ae9e077" (UID: "cdc24810-0778-4b37-8156-ecac9ae9e077"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:37 crc kubenswrapper[4774]: I1003 15:01:37.324209 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95263c08-34fe-4319-ae88-dc01b7609122" path="/var/lib/kubelet/pods/95263c08-34fe-4319-ae88-dc01b7609122/volumes" Oct 03 15:01:37 crc kubenswrapper[4774]: I1003 15:01:37.331023 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdc24810-0778-4b37-8156-ecac9ae9e077-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cdc24810-0778-4b37-8156-ecac9ae9e077" (UID: "cdc24810-0778-4b37-8156-ecac9ae9e077"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:37 crc kubenswrapper[4774]: I1003 15:01:37.392844 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qd6c6\" (UniqueName: \"kubernetes.io/projected/cdc24810-0778-4b37-8156-ecac9ae9e077-kube-api-access-qd6c6\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:37 crc kubenswrapper[4774]: I1003 15:01:37.393093 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/cdc24810-0778-4b37-8156-ecac9ae9e077-config\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:37 crc kubenswrapper[4774]: I1003 15:01:37.393193 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdc24810-0778-4b37-8156-ecac9ae9e077-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:37 crc kubenswrapper[4774]: I1003 15:01:37.797602 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-66tpj" event={"ID":"cdc24810-0778-4b37-8156-ecac9ae9e077","Type":"ContainerDied","Data":"6cfd5d5a159c06119132c546975544a01f9146a307480907fd8f623e2db3fb1c"} Oct 03 15:01:37 crc kubenswrapper[4774]: I1003 15:01:37.799293 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cfd5d5a159c06119132c546975544a01f9146a307480907fd8f623e2db3fb1c" Oct 03 15:01:37 crc kubenswrapper[4774]: I1003 15:01:37.797649 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-66tpj" Oct 03 15:01:37 crc kubenswrapper[4774]: I1003 15:01:37.807204 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9ab201e1-9ef3-485b-81f2-0b421dcc66cc","Type":"ContainerStarted","Data":"48840e45353d51bec85377ead7e54d4b8819f3ac710969f2fb5e297e370878d5"} Oct 03 15:01:37 crc kubenswrapper[4774]: I1003 15:01:37.810450 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="dc3d42b5-126b-433f-b677-5fc8a2f3f1e2" containerName="cinder-api-log" containerID="cri-o://c4aaf7233664f2fa254b7aa73d4880a832ac3a655c1a364b04e3d1b36460ce6a" gracePeriod=30 Oct 03 15:01:37 crc kubenswrapper[4774]: I1003 15:01:37.810931 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2","Type":"ContainerStarted","Data":"02aa17f3d57740d44f5d7c067ba1a7cb5fb13d2b911dba8b54cc52ba9bf13b45"} Oct 03 15:01:37 crc kubenswrapper[4774]: I1003 15:01:37.811060 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 03 15:01:37 crc kubenswrapper[4774]: I1003 15:01:37.811771 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="dc3d42b5-126b-433f-b677-5fc8a2f3f1e2" containerName="cinder-api" containerID="cri-o://02aa17f3d57740d44f5d7c067ba1a7cb5fb13d2b911dba8b54cc52ba9bf13b45" gracePeriod=30 Oct 03 15:01:37 crc kubenswrapper[4774]: I1003 15:01:37.854400 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.854358221 podStartE2EDuration="3.854358221s" podCreationTimestamp="2025-10-03 15:01:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:01:37.847516521 +0000 UTC m=+1120.436719973" 
watchObservedRunningTime="2025-10-03 15:01:37.854358221 +0000 UTC m=+1120.443561673" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.058215 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-fkddj"] Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.102965 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-lvqzl"] Oct 03 15:01:38 crc kubenswrapper[4774]: E1003 15:01:38.103530 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdc24810-0778-4b37-8156-ecac9ae9e077" containerName="neutron-db-sync" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.103549 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdc24810-0778-4b37-8156-ecac9ae9e077" containerName="neutron-db-sync" Oct 03 15:01:38 crc kubenswrapper[4774]: E1003 15:01:38.103566 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95263c08-34fe-4319-ae88-dc01b7609122" containerName="init" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.103573 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="95263c08-34fe-4319-ae88-dc01b7609122" containerName="init" Oct 03 15:01:38 crc kubenswrapper[4774]: E1003 15:01:38.103610 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95263c08-34fe-4319-ae88-dc01b7609122" containerName="dnsmasq-dns" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.103618 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="95263c08-34fe-4319-ae88-dc01b7609122" containerName="dnsmasq-dns" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.103838 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdc24810-0778-4b37-8156-ecac9ae9e077" containerName="neutron-db-sync" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.103858 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="95263c08-34fe-4319-ae88-dc01b7609122" containerName="dnsmasq-dns" Oct 03 15:01:38 crc 
kubenswrapper[4774]: I1003 15:01:38.105166 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-lvqzl" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.123433 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-lvqzl"] Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.217209 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/900d9b8b-8106-44c5-a25a-db56b0639d7f-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-lvqzl\" (UID: \"900d9b8b-8106-44c5-a25a-db56b0639d7f\") " pod="openstack/dnsmasq-dns-5784cf869f-lvqzl" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.217292 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/900d9b8b-8106-44c5-a25a-db56b0639d7f-dns-svc\") pod \"dnsmasq-dns-5784cf869f-lvqzl\" (UID: \"900d9b8b-8106-44c5-a25a-db56b0639d7f\") " pod="openstack/dnsmasq-dns-5784cf869f-lvqzl" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.217400 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/900d9b8b-8106-44c5-a25a-db56b0639d7f-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-lvqzl\" (UID: \"900d9b8b-8106-44c5-a25a-db56b0639d7f\") " pod="openstack/dnsmasq-dns-5784cf869f-lvqzl" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.217429 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/900d9b8b-8106-44c5-a25a-db56b0639d7f-config\") pod \"dnsmasq-dns-5784cf869f-lvqzl\" (UID: \"900d9b8b-8106-44c5-a25a-db56b0639d7f\") " pod="openstack/dnsmasq-dns-5784cf869f-lvqzl" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 
15:01:38.217472 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/900d9b8b-8106-44c5-a25a-db56b0639d7f-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-lvqzl\" (UID: \"900d9b8b-8106-44c5-a25a-db56b0639d7f\") " pod="openstack/dnsmasq-dns-5784cf869f-lvqzl" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.217556 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56c6f\" (UniqueName: \"kubernetes.io/projected/900d9b8b-8106-44c5-a25a-db56b0639d7f-kube-api-access-56c6f\") pod \"dnsmasq-dns-5784cf869f-lvqzl\" (UID: \"900d9b8b-8106-44c5-a25a-db56b0639d7f\") " pod="openstack/dnsmasq-dns-5784cf869f-lvqzl" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.249206 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-77bb8d5544-lc44r"] Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.253311 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-77bb8d5544-lc44r" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.257455 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.257728 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.257909 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-gmbc7" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.258051 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.277686 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-77bb8d5544-lc44r"] Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.320222 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c-config\") pod \"neutron-77bb8d5544-lc44r\" (UID: \"6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c\") " pod="openstack/neutron-77bb8d5544-lc44r" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.320279 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c-ovndb-tls-certs\") pod \"neutron-77bb8d5544-lc44r\" (UID: \"6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c\") " pod="openstack/neutron-77bb8d5544-lc44r" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.320343 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc2jk\" (UniqueName: \"kubernetes.io/projected/6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c-kube-api-access-dc2jk\") pod \"neutron-77bb8d5544-lc44r\" (UID: 
\"6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c\") " pod="openstack/neutron-77bb8d5544-lc44r" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.320400 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/900d9b8b-8106-44c5-a25a-db56b0639d7f-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-lvqzl\" (UID: \"900d9b8b-8106-44c5-a25a-db56b0639d7f\") " pod="openstack/dnsmasq-dns-5784cf869f-lvqzl" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.320471 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/900d9b8b-8106-44c5-a25a-db56b0639d7f-dns-svc\") pod \"dnsmasq-dns-5784cf869f-lvqzl\" (UID: \"900d9b8b-8106-44c5-a25a-db56b0639d7f\") " pod="openstack/dnsmasq-dns-5784cf869f-lvqzl" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.320549 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c-httpd-config\") pod \"neutron-77bb8d5544-lc44r\" (UID: \"6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c\") " pod="openstack/neutron-77bb8d5544-lc44r" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.320598 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/900d9b8b-8106-44c5-a25a-db56b0639d7f-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-lvqzl\" (UID: \"900d9b8b-8106-44c5-a25a-db56b0639d7f\") " pod="openstack/dnsmasq-dns-5784cf869f-lvqzl" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.320622 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c-combined-ca-bundle\") pod \"neutron-77bb8d5544-lc44r\" (UID: 
\"6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c\") " pod="openstack/neutron-77bb8d5544-lc44r" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.320761 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/900d9b8b-8106-44c5-a25a-db56b0639d7f-config\") pod \"dnsmasq-dns-5784cf869f-lvqzl\" (UID: \"900d9b8b-8106-44c5-a25a-db56b0639d7f\") " pod="openstack/dnsmasq-dns-5784cf869f-lvqzl" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.320802 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/900d9b8b-8106-44c5-a25a-db56b0639d7f-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-lvqzl\" (UID: \"900d9b8b-8106-44c5-a25a-db56b0639d7f\") " pod="openstack/dnsmasq-dns-5784cf869f-lvqzl" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.320833 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56c6f\" (UniqueName: \"kubernetes.io/projected/900d9b8b-8106-44c5-a25a-db56b0639d7f-kube-api-access-56c6f\") pod \"dnsmasq-dns-5784cf869f-lvqzl\" (UID: \"900d9b8b-8106-44c5-a25a-db56b0639d7f\") " pod="openstack/dnsmasq-dns-5784cf869f-lvqzl" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.322055 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/900d9b8b-8106-44c5-a25a-db56b0639d7f-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-lvqzl\" (UID: \"900d9b8b-8106-44c5-a25a-db56b0639d7f\") " pod="openstack/dnsmasq-dns-5784cf869f-lvqzl" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.322706 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/900d9b8b-8106-44c5-a25a-db56b0639d7f-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-lvqzl\" (UID: \"900d9b8b-8106-44c5-a25a-db56b0639d7f\") " 
pod="openstack/dnsmasq-dns-5784cf869f-lvqzl" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.323511 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/900d9b8b-8106-44c5-a25a-db56b0639d7f-config\") pod \"dnsmasq-dns-5784cf869f-lvqzl\" (UID: \"900d9b8b-8106-44c5-a25a-db56b0639d7f\") " pod="openstack/dnsmasq-dns-5784cf869f-lvqzl" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.323725 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/900d9b8b-8106-44c5-a25a-db56b0639d7f-dns-svc\") pod \"dnsmasq-dns-5784cf869f-lvqzl\" (UID: \"900d9b8b-8106-44c5-a25a-db56b0639d7f\") " pod="openstack/dnsmasq-dns-5784cf869f-lvqzl" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.326796 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/900d9b8b-8106-44c5-a25a-db56b0639d7f-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-lvqzl\" (UID: \"900d9b8b-8106-44c5-a25a-db56b0639d7f\") " pod="openstack/dnsmasq-dns-5784cf869f-lvqzl" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.388451 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56c6f\" (UniqueName: \"kubernetes.io/projected/900d9b8b-8106-44c5-a25a-db56b0639d7f-kube-api-access-56c6f\") pod \"dnsmasq-dns-5784cf869f-lvqzl\" (UID: \"900d9b8b-8106-44c5-a25a-db56b0639d7f\") " pod="openstack/dnsmasq-dns-5784cf869f-lvqzl" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.422838 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c-combined-ca-bundle\") pod \"neutron-77bb8d5544-lc44r\" (UID: \"6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c\") " pod="openstack/neutron-77bb8d5544-lc44r" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 
15:01:38.423136 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c-config\") pod \"neutron-77bb8d5544-lc44r\" (UID: \"6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c\") " pod="openstack/neutron-77bb8d5544-lc44r" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.423184 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c-ovndb-tls-certs\") pod \"neutron-77bb8d5544-lc44r\" (UID: \"6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c\") " pod="openstack/neutron-77bb8d5544-lc44r" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.423271 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc2jk\" (UniqueName: \"kubernetes.io/projected/6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c-kube-api-access-dc2jk\") pod \"neutron-77bb8d5544-lc44r\" (UID: \"6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c\") " pod="openstack/neutron-77bb8d5544-lc44r" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.423437 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c-httpd-config\") pod \"neutron-77bb8d5544-lc44r\" (UID: \"6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c\") " pod="openstack/neutron-77bb8d5544-lc44r" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.429535 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c-ovndb-tls-certs\") pod \"neutron-77bb8d5544-lc44r\" (UID: \"6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c\") " pod="openstack/neutron-77bb8d5544-lc44r" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.430548 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" 
(UniqueName: \"kubernetes.io/secret/6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c-httpd-config\") pod \"neutron-77bb8d5544-lc44r\" (UID: \"6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c\") " pod="openstack/neutron-77bb8d5544-lc44r" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.433032 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c-combined-ca-bundle\") pod \"neutron-77bb8d5544-lc44r\" (UID: \"6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c\") " pod="openstack/neutron-77bb8d5544-lc44r" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.436570 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c-config\") pod \"neutron-77bb8d5544-lc44r\" (UID: \"6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c\") " pod="openstack/neutron-77bb8d5544-lc44r" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.453831 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc2jk\" (UniqueName: \"kubernetes.io/projected/6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c-kube-api-access-dc2jk\") pod \"neutron-77bb8d5544-lc44r\" (UID: \"6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c\") " pod="openstack/neutron-77bb8d5544-lc44r" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.504950 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-lvqzl" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.601132 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-77bb8d5544-lc44r" Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.822992 4774 generic.go:334] "Generic (PLEG): container finished" podID="dc3d42b5-126b-433f-b677-5fc8a2f3f1e2" containerID="02aa17f3d57740d44f5d7c067ba1a7cb5fb13d2b911dba8b54cc52ba9bf13b45" exitCode=0 Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.824091 4774 generic.go:334] "Generic (PLEG): container finished" podID="dc3d42b5-126b-433f-b677-5fc8a2f3f1e2" containerID="c4aaf7233664f2fa254b7aa73d4880a832ac3a655c1a364b04e3d1b36460ce6a" exitCode=143 Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.823120 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2","Type":"ContainerDied","Data":"02aa17f3d57740d44f5d7c067ba1a7cb5fb13d2b911dba8b54cc52ba9bf13b45"} Oct 03 15:01:38 crc kubenswrapper[4774]: I1003 15:01:38.824311 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2","Type":"ContainerDied","Data":"c4aaf7233664f2fa254b7aa73d4880a832ac3a655c1a364b04e3d1b36460ce6a"} Oct 03 15:01:39 crc kubenswrapper[4774]: I1003 15:01:39.067997 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-78d8f8854b-zdr5d" Oct 03 15:01:39 crc kubenswrapper[4774]: I1003 15:01:39.262040 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-78d8f8854b-zdr5d" Oct 03 15:01:39 crc kubenswrapper[4774]: I1003 15:01:39.831606 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-69c986f6d7-fkddj" podUID="87919e0f-0800-44c9-863a-903c14884ae8" containerName="dnsmasq-dns" containerID="cri-o://00ea6331642e138a0bb3ee87ae1d65c9fd5cf5c7f4e018421fd61797ff725ee5" gracePeriod=10 Oct 03 15:01:40 crc kubenswrapper[4774]: I1003 15:01:40.551899 4774 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/dnsmasq-dns-5784cf869f-lvqzl"] Oct 03 15:01:40 crc kubenswrapper[4774]: I1003 15:01:40.826450 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-987467b4f-dts4l"] Oct 03 15:01:40 crc kubenswrapper[4774]: I1003 15:01:40.828589 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-987467b4f-dts4l" Oct 03 15:01:40 crc kubenswrapper[4774]: I1003 15:01:40.846263 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 03 15:01:40 crc kubenswrapper[4774]: I1003 15:01:40.846484 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 03 15:01:40 crc kubenswrapper[4774]: I1003 15:01:40.865256 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-987467b4f-dts4l"] Oct 03 15:01:40 crc kubenswrapper[4774]: I1003 15:01:40.921483 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c5ff7df6-e04c-431b-bdbb-2579172a7706-httpd-config\") pod \"neutron-987467b4f-dts4l\" (UID: \"c5ff7df6-e04c-431b-bdbb-2579172a7706\") " pod="openstack/neutron-987467b4f-dts4l" Oct 03 15:01:40 crc kubenswrapper[4774]: I1003 15:01:40.921532 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c5ff7df6-e04c-431b-bdbb-2579172a7706-config\") pod \"neutron-987467b4f-dts4l\" (UID: \"c5ff7df6-e04c-431b-bdbb-2579172a7706\") " pod="openstack/neutron-987467b4f-dts4l" Oct 03 15:01:40 crc kubenswrapper[4774]: I1003 15:01:40.921572 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5ff7df6-e04c-431b-bdbb-2579172a7706-public-tls-certs\") pod \"neutron-987467b4f-dts4l\" (UID: 
\"c5ff7df6-e04c-431b-bdbb-2579172a7706\") " pod="openstack/neutron-987467b4f-dts4l" Oct 03 15:01:40 crc kubenswrapper[4774]: I1003 15:01:40.921599 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5ff7df6-e04c-431b-bdbb-2579172a7706-ovndb-tls-certs\") pod \"neutron-987467b4f-dts4l\" (UID: \"c5ff7df6-e04c-431b-bdbb-2579172a7706\") " pod="openstack/neutron-987467b4f-dts4l" Oct 03 15:01:40 crc kubenswrapper[4774]: I1003 15:01:40.921638 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5ff7df6-e04c-431b-bdbb-2579172a7706-combined-ca-bundle\") pod \"neutron-987467b4f-dts4l\" (UID: \"c5ff7df6-e04c-431b-bdbb-2579172a7706\") " pod="openstack/neutron-987467b4f-dts4l" Oct 03 15:01:40 crc kubenswrapper[4774]: I1003 15:01:40.921667 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd7q2\" (UniqueName: \"kubernetes.io/projected/c5ff7df6-e04c-431b-bdbb-2579172a7706-kube-api-access-sd7q2\") pod \"neutron-987467b4f-dts4l\" (UID: \"c5ff7df6-e04c-431b-bdbb-2579172a7706\") " pod="openstack/neutron-987467b4f-dts4l" Oct 03 15:01:40 crc kubenswrapper[4774]: I1003 15:01:40.921697 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5ff7df6-e04c-431b-bdbb-2579172a7706-internal-tls-certs\") pod \"neutron-987467b4f-dts4l\" (UID: \"c5ff7df6-e04c-431b-bdbb-2579172a7706\") " pod="openstack/neutron-987467b4f-dts4l" Oct 03 15:01:40 crc kubenswrapper[4774]: I1003 15:01:40.925699 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"9ab201e1-9ef3-485b-81f2-0b421dcc66cc","Type":"ContainerStarted","Data":"cec1c430a7ebe4634f855fdd18afe1dff2fbf97054719268f31d4a4bfdb5c9e9"} Oct 03 15:01:40 crc kubenswrapper[4774]: I1003 15:01:40.954100 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 15:01:40 crc kubenswrapper[4774]: I1003 15:01:40.956295 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-lvqzl" event={"ID":"900d9b8b-8106-44c5-a25a-db56b0639d7f","Type":"ContainerStarted","Data":"6ea16e4d0e7a42989f3c5673a7df349930d3c139091e35d990d35f6db25d78ad"} Oct 03 15:01:40 crc kubenswrapper[4774]: I1003 15:01:40.974440 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=7.025988397 podStartE2EDuration="7.974423624s" podCreationTimestamp="2025-10-03 15:01:33 +0000 UTC" firstStartedPulling="2025-10-03 15:01:35.092043145 +0000 UTC m=+1117.681246597" lastFinishedPulling="2025-10-03 15:01:36.040478352 +0000 UTC m=+1118.629681824" observedRunningTime="2025-10-03 15:01:40.970850705 +0000 UTC m=+1123.560054157" watchObservedRunningTime="2025-10-03 15:01:40.974423624 +0000 UTC m=+1123.563627076" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:40.985244 4774 generic.go:334] "Generic (PLEG): container finished" podID="87919e0f-0800-44c9-863a-903c14884ae8" containerID="00ea6331642e138a0bb3ee87ae1d65c9fd5cf5c7f4e018421fd61797ff725ee5" exitCode=0 Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:40.985295 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c986f6d7-fkddj" event={"ID":"87919e0f-0800-44c9-863a-903c14884ae8","Type":"ContainerDied","Data":"00ea6331642e138a0bb3ee87ae1d65c9fd5cf5c7f4e018421fd61797ff725ee5"} Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.023465 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-logs\") pod \"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2\" (UID: \"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2\") " Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.023525 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6k7w\" (UniqueName: \"kubernetes.io/projected/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-kube-api-access-z6k7w\") pod \"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2\" (UID: \"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2\") " Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.023559 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-config-data\") pod \"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2\" (UID: \"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2\") " Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.023628 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-etc-machine-id\") pod \"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2\" (UID: \"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2\") " Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.023684 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-scripts\") pod \"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2\" (UID: \"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2\") " Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.023753 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-config-data-custom\") pod \"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2\" (UID: \"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2\") " Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.023780 4774 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-combined-ca-bundle\") pod \"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2\" (UID: \"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2\") " Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.023787 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "dc3d42b5-126b-433f-b677-5fc8a2f3f1e2" (UID: "dc3d42b5-126b-433f-b677-5fc8a2f3f1e2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.024255 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-logs" (OuterVolumeSpecName: "logs") pod "dc3d42b5-126b-433f-b677-5fc8a2f3f1e2" (UID: "dc3d42b5-126b-433f-b677-5fc8a2f3f1e2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.030433 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c5ff7df6-e04c-431b-bdbb-2579172a7706-httpd-config\") pod \"neutron-987467b4f-dts4l\" (UID: \"c5ff7df6-e04c-431b-bdbb-2579172a7706\") " pod="openstack/neutron-987467b4f-dts4l" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.030533 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c5ff7df6-e04c-431b-bdbb-2579172a7706-config\") pod \"neutron-987467b4f-dts4l\" (UID: \"c5ff7df6-e04c-431b-bdbb-2579172a7706\") " pod="openstack/neutron-987467b4f-dts4l" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.030555 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5ff7df6-e04c-431b-bdbb-2579172a7706-public-tls-certs\") pod \"neutron-987467b4f-dts4l\" (UID: \"c5ff7df6-e04c-431b-bdbb-2579172a7706\") " pod="openstack/neutron-987467b4f-dts4l" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.030607 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5ff7df6-e04c-431b-bdbb-2579172a7706-ovndb-tls-certs\") pod \"neutron-987467b4f-dts4l\" (UID: \"c5ff7df6-e04c-431b-bdbb-2579172a7706\") " pod="openstack/neutron-987467b4f-dts4l" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.030690 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5ff7df6-e04c-431b-bdbb-2579172a7706-combined-ca-bundle\") pod \"neutron-987467b4f-dts4l\" (UID: \"c5ff7df6-e04c-431b-bdbb-2579172a7706\") " pod="openstack/neutron-987467b4f-dts4l" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.030758 4774 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd7q2\" (UniqueName: \"kubernetes.io/projected/c5ff7df6-e04c-431b-bdbb-2579172a7706-kube-api-access-sd7q2\") pod \"neutron-987467b4f-dts4l\" (UID: \"c5ff7df6-e04c-431b-bdbb-2579172a7706\") " pod="openstack/neutron-987467b4f-dts4l" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.030824 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5ff7df6-e04c-431b-bdbb-2579172a7706-internal-tls-certs\") pod \"neutron-987467b4f-dts4l\" (UID: \"c5ff7df6-e04c-431b-bdbb-2579172a7706\") " pod="openstack/neutron-987467b4f-dts4l" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.031040 4774 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-logs\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.031052 4774 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.039463 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "dc3d42b5-126b-433f-b677-5fc8a2f3f1e2" (UID: "dc3d42b5-126b-433f-b677-5fc8a2f3f1e2"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.095418 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c5ff7df6-e04c-431b-bdbb-2579172a7706-httpd-config\") pod \"neutron-987467b4f-dts4l\" (UID: \"c5ff7df6-e04c-431b-bdbb-2579172a7706\") " pod="openstack/neutron-987467b4f-dts4l" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.095527 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c5ff7df6-e04c-431b-bdbb-2579172a7706-config\") pod \"neutron-987467b4f-dts4l\" (UID: \"c5ff7df6-e04c-431b-bdbb-2579172a7706\") " pod="openstack/neutron-987467b4f-dts4l" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.096352 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5ff7df6-e04c-431b-bdbb-2579172a7706-ovndb-tls-certs\") pod \"neutron-987467b4f-dts4l\" (UID: \"c5ff7df6-e04c-431b-bdbb-2579172a7706\") " pod="openstack/neutron-987467b4f-dts4l" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.096430 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5ff7df6-e04c-431b-bdbb-2579172a7706-combined-ca-bundle\") pod \"neutron-987467b4f-dts4l\" (UID: \"c5ff7df6-e04c-431b-bdbb-2579172a7706\") " pod="openstack/neutron-987467b4f-dts4l" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.096821 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5ff7df6-e04c-431b-bdbb-2579172a7706-public-tls-certs\") pod \"neutron-987467b4f-dts4l\" (UID: \"c5ff7df6-e04c-431b-bdbb-2579172a7706\") " pod="openstack/neutron-987467b4f-dts4l" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.109483 4774 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-sd7q2\" (UniqueName: \"kubernetes.io/projected/c5ff7df6-e04c-431b-bdbb-2579172a7706-kube-api-access-sd7q2\") pod \"neutron-987467b4f-dts4l\" (UID: \"c5ff7df6-e04c-431b-bdbb-2579172a7706\") " pod="openstack/neutron-987467b4f-dts4l" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.112489 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5ff7df6-e04c-431b-bdbb-2579172a7706-internal-tls-certs\") pod \"neutron-987467b4f-dts4l\" (UID: \"c5ff7df6-e04c-431b-bdbb-2579172a7706\") " pod="openstack/neutron-987467b4f-dts4l" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.113078 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-scripts" (OuterVolumeSpecName: "scripts") pod "dc3d42b5-126b-433f-b677-5fc8a2f3f1e2" (UID: "dc3d42b5-126b-433f-b677-5fc8a2f3f1e2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.140452 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.140490 4774 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.157742 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-kube-api-access-z6k7w" (OuterVolumeSpecName: "kube-api-access-z6k7w") pod "dc3d42b5-126b-433f-b677-5fc8a2f3f1e2" (UID: "dc3d42b5-126b-433f-b677-5fc8a2f3f1e2"). 
InnerVolumeSpecName "kube-api-access-z6k7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.177949 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc3d42b5-126b-433f-b677-5fc8a2f3f1e2" (UID: "dc3d42b5-126b-433f-b677-5fc8a2f3f1e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.210616 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-987467b4f-dts4l" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.242124 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6k7w\" (UniqueName: \"kubernetes.io/projected/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-kube-api-access-z6k7w\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.242162 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.244469 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-config-data" (OuterVolumeSpecName: "config-data") pod "dc3d42b5-126b-433f-b677-5fc8a2f3f1e2" (UID: "dc3d42b5-126b-433f-b677-5fc8a2f3f1e2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.344438 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.394599 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69c986f6d7-fkddj" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.409239 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-77bb8d5544-lc44r"] Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.445337 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87919e0f-0800-44c9-863a-903c14884ae8-dns-svc\") pod \"87919e0f-0800-44c9-863a-903c14884ae8\" (UID: \"87919e0f-0800-44c9-863a-903c14884ae8\") " Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.445705 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87919e0f-0800-44c9-863a-903c14884ae8-ovsdbserver-sb\") pod \"87919e0f-0800-44c9-863a-903c14884ae8\" (UID: \"87919e0f-0800-44c9-863a-903c14884ae8\") " Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.445900 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87919e0f-0800-44c9-863a-903c14884ae8-dns-swift-storage-0\") pod \"87919e0f-0800-44c9-863a-903c14884ae8\" (UID: \"87919e0f-0800-44c9-863a-903c14884ae8\") " Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.445974 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87919e0f-0800-44c9-863a-903c14884ae8-config\") pod 
\"87919e0f-0800-44c9-863a-903c14884ae8\" (UID: \"87919e0f-0800-44c9-863a-903c14884ae8\") " Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.446313 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7vxx\" (UniqueName: \"kubernetes.io/projected/87919e0f-0800-44c9-863a-903c14884ae8-kube-api-access-s7vxx\") pod \"87919e0f-0800-44c9-863a-903c14884ae8\" (UID: \"87919e0f-0800-44c9-863a-903c14884ae8\") " Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.446394 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87919e0f-0800-44c9-863a-903c14884ae8-ovsdbserver-nb\") pod \"87919e0f-0800-44c9-863a-903c14884ae8\" (UID: \"87919e0f-0800-44c9-863a-903c14884ae8\") " Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.464170 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87919e0f-0800-44c9-863a-903c14884ae8-kube-api-access-s7vxx" (OuterVolumeSpecName: "kube-api-access-s7vxx") pod "87919e0f-0800-44c9-863a-903c14884ae8" (UID: "87919e0f-0800-44c9-863a-903c14884ae8"). InnerVolumeSpecName "kube-api-access-s7vxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.548065 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87919e0f-0800-44c9-863a-903c14884ae8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "87919e0f-0800-44c9-863a-903c14884ae8" (UID: "87919e0f-0800-44c9-863a-903c14884ae8"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.549124 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7vxx\" (UniqueName: \"kubernetes.io/projected/87919e0f-0800-44c9-863a-903c14884ae8-kube-api-access-s7vxx\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.549152 4774 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87919e0f-0800-44c9-863a-903c14884ae8-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.554049 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87919e0f-0800-44c9-863a-903c14884ae8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "87919e0f-0800-44c9-863a-903c14884ae8" (UID: "87919e0f-0800-44c9-863a-903c14884ae8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.559809 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87919e0f-0800-44c9-863a-903c14884ae8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "87919e0f-0800-44c9-863a-903c14884ae8" (UID: "87919e0f-0800-44c9-863a-903c14884ae8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.580132 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87919e0f-0800-44c9-863a-903c14884ae8-config" (OuterVolumeSpecName: "config") pod "87919e0f-0800-44c9-863a-903c14884ae8" (UID: "87919e0f-0800-44c9-863a-903c14884ae8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.590859 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87919e0f-0800-44c9-863a-903c14884ae8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "87919e0f-0800-44c9-863a-903c14884ae8" (UID: "87919e0f-0800-44c9-863a-903c14884ae8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.613240 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-bc5bdf456-xt2x4" podUID="871f7d16-54b6-4aa9-8e99-00a888d41f70" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.652201 4774 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87919e0f-0800-44c9-863a-903c14884ae8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.652231 4774 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87919e0f-0800-44c9-863a-903c14884ae8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.652243 4774 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87919e0f-0800-44c9-863a-903c14884ae8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.652254 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87919e0f-0800-44c9-863a-903c14884ae8-config\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:41 crc 
kubenswrapper[4774]: I1003 15:01:41.852759 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7444dd849d-z82k5" Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.884870 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-987467b4f-dts4l"] Oct 03 15:01:41 crc kubenswrapper[4774]: I1003 15:01:41.999203 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-987467b4f-dts4l" event={"ID":"c5ff7df6-e04c-431b-bdbb-2579172a7706","Type":"ContainerStarted","Data":"15193c6907af8717563031a0fc208f00c68aa24a1e4b721a483ba787c673f6b7"} Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.005941 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.006642 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dc3d42b5-126b-433f-b677-5fc8a2f3f1e2","Type":"ContainerDied","Data":"e6e0b7db5f32e092e9c7de8ba6637494f34e877bad9b0ac96f37dbfff9dd0072"} Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.006686 4774 scope.go:117] "RemoveContainer" containerID="02aa17f3d57740d44f5d7c067ba1a7cb5fb13d2b911dba8b54cc52ba9bf13b45" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.013298 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77bb8d5544-lc44r" event={"ID":"6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c","Type":"ContainerStarted","Data":"4fbe9fa375c2de6d09e0551a0d46765f1b6fc379dc2b0650a0d993bd9a9f8d9f"} Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.013333 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77bb8d5544-lc44r" event={"ID":"6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c","Type":"ContainerStarted","Data":"6abece3013abb91833b7834b0416ec0d019a4edbf03c419fb55d137beffaa70a"} Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.013354 4774 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/neutron-77bb8d5544-lc44r" event={"ID":"6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c","Type":"ContainerStarted","Data":"cd6cfbc3bcde1deb6f275865320f661a41ba64b3f82471351552d04b5a74a4c5"} Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.013390 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-77bb8d5544-lc44r" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.015515 4774 generic.go:334] "Generic (PLEG): container finished" podID="900d9b8b-8106-44c5-a25a-db56b0639d7f" containerID="af096e38e623058c9881f6275cb8f7c8a76aba4c10e8869a317a763a6e7d1dd9" exitCode=0 Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.015559 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-lvqzl" event={"ID":"900d9b8b-8106-44c5-a25a-db56b0639d7f","Type":"ContainerDied","Data":"af096e38e623058c9881f6275cb8f7c8a76aba4c10e8869a317a763a6e7d1dd9"} Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.027528 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69c986f6d7-fkddj" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.027523 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c986f6d7-fkddj" event={"ID":"87919e0f-0800-44c9-863a-903c14884ae8","Type":"ContainerDied","Data":"557a3407c5ed0384afe9bfa3f1ca54193ff85e1b18e03df7017b6c2f3f3011fa"} Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.045554 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-77bb8d5544-lc44r" podStartSLOduration=4.045536342 podStartE2EDuration="4.045536342s" podCreationTimestamp="2025-10-03 15:01:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:01:42.03981536 +0000 UTC m=+1124.629018812" watchObservedRunningTime="2025-10-03 15:01:42.045536342 +0000 UTC m=+1124.634739794" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.209150 4774 scope.go:117] "RemoveContainer" containerID="c4aaf7233664f2fa254b7aa73d4880a832ac3a655c1a364b04e3d1b36460ce6a" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.249338 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-fkddj"] Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.251574 4774 scope.go:117] "RemoveContainer" containerID="00ea6331642e138a0bb3ee87ae1d65c9fd5cf5c7f4e018421fd61797ff725ee5" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.268492 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-fkddj"] Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.276200 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.290530 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.293884 4774 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 03 15:01:42 crc kubenswrapper[4774]: E1003 15:01:42.294244 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87919e0f-0800-44c9-863a-903c14884ae8" containerName="init" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.294259 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="87919e0f-0800-44c9-863a-903c14884ae8" containerName="init" Oct 03 15:01:42 crc kubenswrapper[4774]: E1003 15:01:42.294284 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87919e0f-0800-44c9-863a-903c14884ae8" containerName="dnsmasq-dns" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.294290 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="87919e0f-0800-44c9-863a-903c14884ae8" containerName="dnsmasq-dns" Oct 03 15:01:42 crc kubenswrapper[4774]: E1003 15:01:42.294312 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc3d42b5-126b-433f-b677-5fc8a2f3f1e2" containerName="cinder-api" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.294319 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc3d42b5-126b-433f-b677-5fc8a2f3f1e2" containerName="cinder-api" Oct 03 15:01:42 crc kubenswrapper[4774]: E1003 15:01:42.294339 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc3d42b5-126b-433f-b677-5fc8a2f3f1e2" containerName="cinder-api-log" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.294357 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc3d42b5-126b-433f-b677-5fc8a2f3f1e2" containerName="cinder-api-log" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.294664 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="87919e0f-0800-44c9-863a-903c14884ae8" containerName="dnsmasq-dns" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.294686 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc3d42b5-126b-433f-b677-5fc8a2f3f1e2" 
containerName="cinder-api-log" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.294704 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc3d42b5-126b-433f-b677-5fc8a2f3f1e2" containerName="cinder-api" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.295336 4774 scope.go:117] "RemoveContainer" containerID="619db68667a34acd223ae1773ed5cc9757e9f64da3f5d42e6da87c5a988331fb" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.295599 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.299705 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.299895 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.300003 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.334443 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.366628 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phmmz\" (UniqueName: \"kubernetes.io/projected/c2605586-9dec-4e4f-a61d-7a93535cbaa2-kube-api-access-phmmz\") pod \"cinder-api-0\" (UID: \"c2605586-9dec-4e4f-a61d-7a93535cbaa2\") " pod="openstack/cinder-api-0" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.366670 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2605586-9dec-4e4f-a61d-7a93535cbaa2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c2605586-9dec-4e4f-a61d-7a93535cbaa2\") " pod="openstack/cinder-api-0" Oct 
03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.366789 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2605586-9dec-4e4f-a61d-7a93535cbaa2-logs\") pod \"cinder-api-0\" (UID: \"c2605586-9dec-4e4f-a61d-7a93535cbaa2\") " pod="openstack/cinder-api-0" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.366808 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2605586-9dec-4e4f-a61d-7a93535cbaa2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c2605586-9dec-4e4f-a61d-7a93535cbaa2\") " pod="openstack/cinder-api-0" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.366827 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2605586-9dec-4e4f-a61d-7a93535cbaa2-scripts\") pod \"cinder-api-0\" (UID: \"c2605586-9dec-4e4f-a61d-7a93535cbaa2\") " pod="openstack/cinder-api-0" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.366866 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2605586-9dec-4e4f-a61d-7a93535cbaa2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c2605586-9dec-4e4f-a61d-7a93535cbaa2\") " pod="openstack/cinder-api-0" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.366964 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2605586-9dec-4e4f-a61d-7a93535cbaa2-config-data\") pod \"cinder-api-0\" (UID: \"c2605586-9dec-4e4f-a61d-7a93535cbaa2\") " pod="openstack/cinder-api-0" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.367009 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2605586-9dec-4e4f-a61d-7a93535cbaa2-config-data-custom\") pod \"cinder-api-0\" (UID: \"c2605586-9dec-4e4f-a61d-7a93535cbaa2\") " pod="openstack/cinder-api-0" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.367031 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c2605586-9dec-4e4f-a61d-7a93535cbaa2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c2605586-9dec-4e4f-a61d-7a93535cbaa2\") " pod="openstack/cinder-api-0" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.468740 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2605586-9dec-4e4f-a61d-7a93535cbaa2-logs\") pod \"cinder-api-0\" (UID: \"c2605586-9dec-4e4f-a61d-7a93535cbaa2\") " pod="openstack/cinder-api-0" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.468791 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2605586-9dec-4e4f-a61d-7a93535cbaa2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c2605586-9dec-4e4f-a61d-7a93535cbaa2\") " pod="openstack/cinder-api-0" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.468820 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2605586-9dec-4e4f-a61d-7a93535cbaa2-scripts\") pod \"cinder-api-0\" (UID: \"c2605586-9dec-4e4f-a61d-7a93535cbaa2\") " pod="openstack/cinder-api-0" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.468866 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2605586-9dec-4e4f-a61d-7a93535cbaa2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c2605586-9dec-4e4f-a61d-7a93535cbaa2\") " 
pod="openstack/cinder-api-0" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.468891 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2605586-9dec-4e4f-a61d-7a93535cbaa2-config-data\") pod \"cinder-api-0\" (UID: \"c2605586-9dec-4e4f-a61d-7a93535cbaa2\") " pod="openstack/cinder-api-0" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.468939 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2605586-9dec-4e4f-a61d-7a93535cbaa2-config-data-custom\") pod \"cinder-api-0\" (UID: \"c2605586-9dec-4e4f-a61d-7a93535cbaa2\") " pod="openstack/cinder-api-0" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.468968 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c2605586-9dec-4e4f-a61d-7a93535cbaa2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c2605586-9dec-4e4f-a61d-7a93535cbaa2\") " pod="openstack/cinder-api-0" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.469008 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phmmz\" (UniqueName: \"kubernetes.io/projected/c2605586-9dec-4e4f-a61d-7a93535cbaa2-kube-api-access-phmmz\") pod \"cinder-api-0\" (UID: \"c2605586-9dec-4e4f-a61d-7a93535cbaa2\") " pod="openstack/cinder-api-0" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.469034 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2605586-9dec-4e4f-a61d-7a93535cbaa2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c2605586-9dec-4e4f-a61d-7a93535cbaa2\") " pod="openstack/cinder-api-0" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.469775 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/c2605586-9dec-4e4f-a61d-7a93535cbaa2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c2605586-9dec-4e4f-a61d-7a93535cbaa2\") " pod="openstack/cinder-api-0" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.469985 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2605586-9dec-4e4f-a61d-7a93535cbaa2-logs\") pod \"cinder-api-0\" (UID: \"c2605586-9dec-4e4f-a61d-7a93535cbaa2\") " pod="openstack/cinder-api-0" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.473649 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2605586-9dec-4e4f-a61d-7a93535cbaa2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c2605586-9dec-4e4f-a61d-7a93535cbaa2\") " pod="openstack/cinder-api-0" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.473650 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2605586-9dec-4e4f-a61d-7a93535cbaa2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c2605586-9dec-4e4f-a61d-7a93535cbaa2\") " pod="openstack/cinder-api-0" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.473969 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2605586-9dec-4e4f-a61d-7a93535cbaa2-scripts\") pod \"cinder-api-0\" (UID: \"c2605586-9dec-4e4f-a61d-7a93535cbaa2\") " pod="openstack/cinder-api-0" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.477624 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2605586-9dec-4e4f-a61d-7a93535cbaa2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c2605586-9dec-4e4f-a61d-7a93535cbaa2\") " pod="openstack/cinder-api-0" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.478135 4774 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2605586-9dec-4e4f-a61d-7a93535cbaa2-config-data-custom\") pod \"cinder-api-0\" (UID: \"c2605586-9dec-4e4f-a61d-7a93535cbaa2\") " pod="openstack/cinder-api-0" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.478231 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2605586-9dec-4e4f-a61d-7a93535cbaa2-config-data\") pod \"cinder-api-0\" (UID: \"c2605586-9dec-4e4f-a61d-7a93535cbaa2\") " pod="openstack/cinder-api-0" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.487564 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phmmz\" (UniqueName: \"kubernetes.io/projected/c2605586-9dec-4e4f-a61d-7a93535cbaa2-kube-api-access-phmmz\") pod \"cinder-api-0\" (UID: \"c2605586-9dec-4e4f-a61d-7a93535cbaa2\") " pod="openstack/cinder-api-0" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.552844 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5587b8897d-cknc7" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.629837 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.893140 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5587b8897d-cknc7" Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.952258 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-78d8f8854b-zdr5d"] Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.952581 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-78d8f8854b-zdr5d" podUID="d926f2ba-fdb2-4eb0-a509-5b9eb13ed626" containerName="barbican-api-log" containerID="cri-o://975761272e9cd5eaae32f0a2f20a24e0dd1eb200f15996617c757a59662d1f70" gracePeriod=30 Oct 03 15:01:42 crc kubenswrapper[4774]: I1003 15:01:42.952752 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-78d8f8854b-zdr5d" podUID="d926f2ba-fdb2-4eb0-a509-5b9eb13ed626" containerName="barbican-api" containerID="cri-o://5625eea49c42c6a502d69e96000ba3039668090d6a81040b9b904c035bd8a32c" gracePeriod=30 Oct 03 15:01:43 crc kubenswrapper[4774]: I1003 15:01:43.055314 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-987467b4f-dts4l" event={"ID":"c5ff7df6-e04c-431b-bdbb-2579172a7706","Type":"ContainerStarted","Data":"230e8d32fed8a95335c4e8e8950e8429b7d55e6bd5789cb4ac6e7bbf3140ebe1"} Oct 03 15:01:43 crc kubenswrapper[4774]: I1003 15:01:43.055686 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-987467b4f-dts4l" event={"ID":"c5ff7df6-e04c-431b-bdbb-2579172a7706","Type":"ContainerStarted","Data":"5b6aad8e672adb658832365fd6b072891d5b30a4e7c45474a5b55421f53bdf8b"} Oct 03 15:01:43 crc kubenswrapper[4774]: I1003 15:01:43.055735 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-987467b4f-dts4l" Oct 03 15:01:43 crc kubenswrapper[4774]: I1003 15:01:43.065014 4774 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-lvqzl" event={"ID":"900d9b8b-8106-44c5-a25a-db56b0639d7f","Type":"ContainerStarted","Data":"19b2a88f3ad1e8d5bb7b074b82327f25e7eee7aa1dfd2f202b6eb0f21e8b545e"} Oct 03 15:01:43 crc kubenswrapper[4774]: I1003 15:01:43.065058 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5784cf869f-lvqzl" Oct 03 15:01:43 crc kubenswrapper[4774]: I1003 15:01:43.082321 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-987467b4f-dts4l" podStartSLOduration=3.082307579 podStartE2EDuration="3.082307579s" podCreationTimestamp="2025-10-03 15:01:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:01:43.081730274 +0000 UTC m=+1125.670933766" watchObservedRunningTime="2025-10-03 15:01:43.082307579 +0000 UTC m=+1125.671511021" Oct 03 15:01:43 crc kubenswrapper[4774]: I1003 15:01:43.110049 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 15:01:43 crc kubenswrapper[4774]: I1003 15:01:43.122514 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 03 15:01:43 crc kubenswrapper[4774]: I1003 15:01:43.127611 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 03 15:01:43 crc kubenswrapper[4774]: I1003 15:01:43.132230 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-rrp56" Oct 03 15:01:43 crc kubenswrapper[4774]: I1003 15:01:43.132584 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 03 15:01:43 crc kubenswrapper[4774]: I1003 15:01:43.133516 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 03 15:01:43 crc kubenswrapper[4774]: I1003 15:01:43.169815 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5784cf869f-lvqzl" podStartSLOduration=5.169789417 podStartE2EDuration="5.169789417s" podCreationTimestamp="2025-10-03 15:01:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:01:43.1094071 +0000 UTC m=+1125.698610552" watchObservedRunningTime="2025-10-03 15:01:43.169789417 +0000 UTC m=+1125.758992869" Oct 03 15:01:43 crc kubenswrapper[4774]: I1003 15:01:43.186527 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn9xb\" (UniqueName: \"kubernetes.io/projected/f975364c-ff2b-49bb-9e5e-c0fea0d15daa-kube-api-access-vn9xb\") pod \"openstackclient\" (UID: \"f975364c-ff2b-49bb-9e5e-c0fea0d15daa\") " pod="openstack/openstackclient" Oct 03 15:01:43 crc kubenswrapper[4774]: I1003 15:01:43.186631 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f975364c-ff2b-49bb-9e5e-c0fea0d15daa-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f975364c-ff2b-49bb-9e5e-c0fea0d15daa\") " pod="openstack/openstackclient" Oct 03 15:01:43 crc kubenswrapper[4774]: I1003 15:01:43.186836 4774 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f975364c-ff2b-49bb-9e5e-c0fea0d15daa-openstack-config-secret\") pod \"openstackclient\" (UID: \"f975364c-ff2b-49bb-9e5e-c0fea0d15daa\") " pod="openstack/openstackclient" Oct 03 15:01:43 crc kubenswrapper[4774]: I1003 15:01:43.186929 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f975364c-ff2b-49bb-9e5e-c0fea0d15daa-openstack-config\") pod \"openstackclient\" (UID: \"f975364c-ff2b-49bb-9e5e-c0fea0d15daa\") " pod="openstack/openstackclient" Oct 03 15:01:43 crc kubenswrapper[4774]: I1003 15:01:43.237472 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 03 15:01:43 crc kubenswrapper[4774]: I1003 15:01:43.288618 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn9xb\" (UniqueName: \"kubernetes.io/projected/f975364c-ff2b-49bb-9e5e-c0fea0d15daa-kube-api-access-vn9xb\") pod \"openstackclient\" (UID: \"f975364c-ff2b-49bb-9e5e-c0fea0d15daa\") " pod="openstack/openstackclient" Oct 03 15:01:43 crc kubenswrapper[4774]: I1003 15:01:43.288677 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f975364c-ff2b-49bb-9e5e-c0fea0d15daa-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f975364c-ff2b-49bb-9e5e-c0fea0d15daa\") " pod="openstack/openstackclient" Oct 03 15:01:43 crc kubenswrapper[4774]: I1003 15:01:43.288753 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f975364c-ff2b-49bb-9e5e-c0fea0d15daa-openstack-config-secret\") pod \"openstackclient\" (UID: \"f975364c-ff2b-49bb-9e5e-c0fea0d15daa\") " pod="openstack/openstackclient" Oct 03 
15:01:43 crc kubenswrapper[4774]: I1003 15:01:43.288804 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f975364c-ff2b-49bb-9e5e-c0fea0d15daa-openstack-config\") pod \"openstackclient\" (UID: \"f975364c-ff2b-49bb-9e5e-c0fea0d15daa\") " pod="openstack/openstackclient" Oct 03 15:01:43 crc kubenswrapper[4774]: I1003 15:01:43.289529 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f975364c-ff2b-49bb-9e5e-c0fea0d15daa-openstack-config\") pod \"openstackclient\" (UID: \"f975364c-ff2b-49bb-9e5e-c0fea0d15daa\") " pod="openstack/openstackclient" Oct 03 15:01:43 crc kubenswrapper[4774]: I1003 15:01:43.308566 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f975364c-ff2b-49bb-9e5e-c0fea0d15daa-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f975364c-ff2b-49bb-9e5e-c0fea0d15daa\") " pod="openstack/openstackclient" Oct 03 15:01:43 crc kubenswrapper[4774]: I1003 15:01:43.310196 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f975364c-ff2b-49bb-9e5e-c0fea0d15daa-openstack-config-secret\") pod \"openstackclient\" (UID: \"f975364c-ff2b-49bb-9e5e-c0fea0d15daa\") " pod="openstack/openstackclient" Oct 03 15:01:43 crc kubenswrapper[4774]: I1003 15:01:43.322147 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn9xb\" (UniqueName: \"kubernetes.io/projected/f975364c-ff2b-49bb-9e5e-c0fea0d15daa-kube-api-access-vn9xb\") pod \"openstackclient\" (UID: \"f975364c-ff2b-49bb-9e5e-c0fea0d15daa\") " pod="openstack/openstackclient" Oct 03 15:01:43 crc kubenswrapper[4774]: I1003 15:01:43.343309 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87919e0f-0800-44c9-863a-903c14884ae8" 
path="/var/lib/kubelet/pods/87919e0f-0800-44c9-863a-903c14884ae8/volumes" Oct 03 15:01:43 crc kubenswrapper[4774]: I1003 15:01:43.343937 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc3d42b5-126b-433f-b677-5fc8a2f3f1e2" path="/var/lib/kubelet/pods/dc3d42b5-126b-433f-b677-5fc8a2f3f1e2/volumes" Oct 03 15:01:43 crc kubenswrapper[4774]: I1003 15:01:43.501673 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 03 15:01:43 crc kubenswrapper[4774]: I1003 15:01:43.980340 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 03 15:01:44 crc kubenswrapper[4774]: I1003 15:01:44.091572 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c2605586-9dec-4e4f-a61d-7a93535cbaa2","Type":"ContainerStarted","Data":"b190b5c5d71355b63cd07124b2b4da8170217731a65f7207cd2cbb2160bc39be"} Oct 03 15:01:44 crc kubenswrapper[4774]: I1003 15:01:44.091614 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c2605586-9dec-4e4f-a61d-7a93535cbaa2","Type":"ContainerStarted","Data":"b070e4c533dac9c7f11de7635cf01a126d8554c74a628982546dfc5f3139b068"} Oct 03 15:01:44 crc kubenswrapper[4774]: I1003 15:01:44.092557 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f975364c-ff2b-49bb-9e5e-c0fea0d15daa","Type":"ContainerStarted","Data":"9efc484cacd98754a06007ee3442fbd7326d809b5d9e399b39ab5235769d0529"} Oct 03 15:01:44 crc kubenswrapper[4774]: I1003 15:01:44.094263 4774 generic.go:334] "Generic (PLEG): container finished" podID="d926f2ba-fdb2-4eb0-a509-5b9eb13ed626" containerID="975761272e9cd5eaae32f0a2f20a24e0dd1eb200f15996617c757a59662d1f70" exitCode=143 Oct 03 15:01:44 crc kubenswrapper[4774]: I1003 15:01:44.095192 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-78d8f8854b-zdr5d" 
event={"ID":"d926f2ba-fdb2-4eb0-a509-5b9eb13ed626","Type":"ContainerDied","Data":"975761272e9cd5eaae32f0a2f20a24e0dd1eb200f15996617c757a59662d1f70"} Oct 03 15:01:44 crc kubenswrapper[4774]: I1003 15:01:44.255634 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 03 15:01:44 crc kubenswrapper[4774]: I1003 15:01:44.688703 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 03 15:01:45 crc kubenswrapper[4774]: I1003 15:01:45.106178 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c2605586-9dec-4e4f-a61d-7a93535cbaa2","Type":"ContainerStarted","Data":"22368ab5dc17b1db819344858df8d65ed31bab996f5a30e20c24f4ee313d4a82"} Oct 03 15:01:45 crc kubenswrapper[4774]: I1003 15:01:45.106666 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 03 15:01:45 crc kubenswrapper[4774]: I1003 15:01:45.136139 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.136116914 podStartE2EDuration="3.136116914s" podCreationTimestamp="2025-10-03 15:01:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:01:45.129724876 +0000 UTC m=+1127.718928328" watchObservedRunningTime="2025-10-03 15:01:45.136116914 +0000 UTC m=+1127.725320366" Oct 03 15:01:45 crc kubenswrapper[4774]: I1003 15:01:45.180070 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 15:01:46 crc kubenswrapper[4774]: I1003 15:01:46.134767 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="9ab201e1-9ef3-485b-81f2-0b421dcc66cc" containerName="probe" containerID="cri-o://cec1c430a7ebe4634f855fdd18afe1dff2fbf97054719268f31d4a4bfdb5c9e9" 
gracePeriod=30 Oct 03 15:01:46 crc kubenswrapper[4774]: I1003 15:01:46.134729 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="9ab201e1-9ef3-485b-81f2-0b421dcc66cc" containerName="cinder-scheduler" containerID="cri-o://48840e45353d51bec85377ead7e54d4b8819f3ac710969f2fb5e297e370878d5" gracePeriod=30 Oct 03 15:01:46 crc kubenswrapper[4774]: I1003 15:01:46.656021 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-78d8f8854b-zdr5d" Oct 03 15:01:46 crc kubenswrapper[4774]: I1003 15:01:46.793216 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d926f2ba-fdb2-4eb0-a509-5b9eb13ed626-combined-ca-bundle\") pod \"d926f2ba-fdb2-4eb0-a509-5b9eb13ed626\" (UID: \"d926f2ba-fdb2-4eb0-a509-5b9eb13ed626\") " Oct 03 15:01:46 crc kubenswrapper[4774]: I1003 15:01:46.793305 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d926f2ba-fdb2-4eb0-a509-5b9eb13ed626-config-data\") pod \"d926f2ba-fdb2-4eb0-a509-5b9eb13ed626\" (UID: \"d926f2ba-fdb2-4eb0-a509-5b9eb13ed626\") " Oct 03 15:01:46 crc kubenswrapper[4774]: I1003 15:01:46.793350 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5n8gh\" (UniqueName: \"kubernetes.io/projected/d926f2ba-fdb2-4eb0-a509-5b9eb13ed626-kube-api-access-5n8gh\") pod \"d926f2ba-fdb2-4eb0-a509-5b9eb13ed626\" (UID: \"d926f2ba-fdb2-4eb0-a509-5b9eb13ed626\") " Oct 03 15:01:46 crc kubenswrapper[4774]: I1003 15:01:46.793442 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d926f2ba-fdb2-4eb0-a509-5b9eb13ed626-logs\") pod \"d926f2ba-fdb2-4eb0-a509-5b9eb13ed626\" (UID: \"d926f2ba-fdb2-4eb0-a509-5b9eb13ed626\") " Oct 03 15:01:46 crc 
kubenswrapper[4774]: I1003 15:01:46.793512 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d926f2ba-fdb2-4eb0-a509-5b9eb13ed626-config-data-custom\") pod \"d926f2ba-fdb2-4eb0-a509-5b9eb13ed626\" (UID: \"d926f2ba-fdb2-4eb0-a509-5b9eb13ed626\") " Oct 03 15:01:46 crc kubenswrapper[4774]: I1003 15:01:46.794149 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d926f2ba-fdb2-4eb0-a509-5b9eb13ed626-logs" (OuterVolumeSpecName: "logs") pod "d926f2ba-fdb2-4eb0-a509-5b9eb13ed626" (UID: "d926f2ba-fdb2-4eb0-a509-5b9eb13ed626"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:01:46 crc kubenswrapper[4774]: I1003 15:01:46.808811 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d926f2ba-fdb2-4eb0-a509-5b9eb13ed626-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d926f2ba-fdb2-4eb0-a509-5b9eb13ed626" (UID: "d926f2ba-fdb2-4eb0-a509-5b9eb13ed626"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:46 crc kubenswrapper[4774]: I1003 15:01:46.815533 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d926f2ba-fdb2-4eb0-a509-5b9eb13ed626-kube-api-access-5n8gh" (OuterVolumeSpecName: "kube-api-access-5n8gh") pod "d926f2ba-fdb2-4eb0-a509-5b9eb13ed626" (UID: "d926f2ba-fdb2-4eb0-a509-5b9eb13ed626"). InnerVolumeSpecName "kube-api-access-5n8gh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:01:46 crc kubenswrapper[4774]: I1003 15:01:46.827803 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d926f2ba-fdb2-4eb0-a509-5b9eb13ed626-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d926f2ba-fdb2-4eb0-a509-5b9eb13ed626" (UID: "d926f2ba-fdb2-4eb0-a509-5b9eb13ed626"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:46 crc kubenswrapper[4774]: I1003 15:01:46.860573 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d926f2ba-fdb2-4eb0-a509-5b9eb13ed626-config-data" (OuterVolumeSpecName: "config-data") pod "d926f2ba-fdb2-4eb0-a509-5b9eb13ed626" (UID: "d926f2ba-fdb2-4eb0-a509-5b9eb13ed626"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:46 crc kubenswrapper[4774]: I1003 15:01:46.897732 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d926f2ba-fdb2-4eb0-a509-5b9eb13ed626-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:46 crc kubenswrapper[4774]: I1003 15:01:46.897773 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d926f2ba-fdb2-4eb0-a509-5b9eb13ed626-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:46 crc kubenswrapper[4774]: I1003 15:01:46.897787 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5n8gh\" (UniqueName: \"kubernetes.io/projected/d926f2ba-fdb2-4eb0-a509-5b9eb13ed626-kube-api-access-5n8gh\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:46 crc kubenswrapper[4774]: I1003 15:01:46.897866 4774 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d926f2ba-fdb2-4eb0-a509-5b9eb13ed626-logs\") on node \"crc\" DevicePath \"\"" Oct 03 
15:01:46 crc kubenswrapper[4774]: I1003 15:01:46.897878 4774 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d926f2ba-fdb2-4eb0-a509-5b9eb13ed626-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.145071 4774 generic.go:334] "Generic (PLEG): container finished" podID="9ab201e1-9ef3-485b-81f2-0b421dcc66cc" containerID="cec1c430a7ebe4634f855fdd18afe1dff2fbf97054719268f31d4a4bfdb5c9e9" exitCode=0 Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.145133 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9ab201e1-9ef3-485b-81f2-0b421dcc66cc","Type":"ContainerDied","Data":"cec1c430a7ebe4634f855fdd18afe1dff2fbf97054719268f31d4a4bfdb5c9e9"} Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.147132 4774 generic.go:334] "Generic (PLEG): container finished" podID="d926f2ba-fdb2-4eb0-a509-5b9eb13ed626" containerID="5625eea49c42c6a502d69e96000ba3039668090d6a81040b9b904c035bd8a32c" exitCode=0 Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.147155 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-78d8f8854b-zdr5d" event={"ID":"d926f2ba-fdb2-4eb0-a509-5b9eb13ed626","Type":"ContainerDied","Data":"5625eea49c42c6a502d69e96000ba3039668090d6a81040b9b904c035bd8a32c"} Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.147172 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-78d8f8854b-zdr5d" event={"ID":"d926f2ba-fdb2-4eb0-a509-5b9eb13ed626","Type":"ContainerDied","Data":"d488a05490e9808dd70e8c58e9e89fee94ce23e580e98baaaaa1f1c7b20919cd"} Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.147187 4774 scope.go:117] "RemoveContainer" containerID="5625eea49c42c6a502d69e96000ba3039668090d6a81040b9b904c035bd8a32c" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.147189 4774 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/barbican-api-78d8f8854b-zdr5d" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.182411 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-78d8f8854b-zdr5d"] Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.189104 4774 scope.go:117] "RemoveContainer" containerID="975761272e9cd5eaae32f0a2f20a24e0dd1eb200f15996617c757a59662d1f70" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.193332 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-78d8f8854b-zdr5d"] Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.247833 4774 scope.go:117] "RemoveContainer" containerID="5625eea49c42c6a502d69e96000ba3039668090d6a81040b9b904c035bd8a32c" Oct 03 15:01:47 crc kubenswrapper[4774]: E1003 15:01:47.248242 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5625eea49c42c6a502d69e96000ba3039668090d6a81040b9b904c035bd8a32c\": container with ID starting with 5625eea49c42c6a502d69e96000ba3039668090d6a81040b9b904c035bd8a32c not found: ID does not exist" containerID="5625eea49c42c6a502d69e96000ba3039668090d6a81040b9b904c035bd8a32c" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.248277 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5625eea49c42c6a502d69e96000ba3039668090d6a81040b9b904c035bd8a32c"} err="failed to get container status \"5625eea49c42c6a502d69e96000ba3039668090d6a81040b9b904c035bd8a32c\": rpc error: code = NotFound desc = could not find container \"5625eea49c42c6a502d69e96000ba3039668090d6a81040b9b904c035bd8a32c\": container with ID starting with 5625eea49c42c6a502d69e96000ba3039668090d6a81040b9b904c035bd8a32c not found: ID does not exist" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.248298 4774 scope.go:117] "RemoveContainer" 
containerID="975761272e9cd5eaae32f0a2f20a24e0dd1eb200f15996617c757a59662d1f70" Oct 03 15:01:47 crc kubenswrapper[4774]: E1003 15:01:47.248667 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"975761272e9cd5eaae32f0a2f20a24e0dd1eb200f15996617c757a59662d1f70\": container with ID starting with 975761272e9cd5eaae32f0a2f20a24e0dd1eb200f15996617c757a59662d1f70 not found: ID does not exist" containerID="975761272e9cd5eaae32f0a2f20a24e0dd1eb200f15996617c757a59662d1f70" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.248715 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"975761272e9cd5eaae32f0a2f20a24e0dd1eb200f15996617c757a59662d1f70"} err="failed to get container status \"975761272e9cd5eaae32f0a2f20a24e0dd1eb200f15996617c757a59662d1f70\": rpc error: code = NotFound desc = could not find container \"975761272e9cd5eaae32f0a2f20a24e0dd1eb200f15996617c757a59662d1f70\": container with ID starting with 975761272e9cd5eaae32f0a2f20a24e0dd1eb200f15996617c757a59662d1f70 not found: ID does not exist" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.311995 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d926f2ba-fdb2-4eb0-a509-5b9eb13ed626" path="/var/lib/kubelet/pods/d926f2ba-fdb2-4eb0-a509-5b9eb13ed626/volumes" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.726227 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5c49d658df-5r5zg"] Oct 03 15:01:47 crc kubenswrapper[4774]: E1003 15:01:47.726722 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d926f2ba-fdb2-4eb0-a509-5b9eb13ed626" containerName="barbican-api" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.726745 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="d926f2ba-fdb2-4eb0-a509-5b9eb13ed626" containerName="barbican-api" Oct 03 15:01:47 crc kubenswrapper[4774]: E1003 
15:01:47.726766 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d926f2ba-fdb2-4eb0-a509-5b9eb13ed626" containerName="barbican-api-log" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.726774 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="d926f2ba-fdb2-4eb0-a509-5b9eb13ed626" containerName="barbican-api-log" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.726992 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="d926f2ba-fdb2-4eb0-a509-5b9eb13ed626" containerName="barbican-api" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.727015 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="d926f2ba-fdb2-4eb0-a509-5b9eb13ed626" containerName="barbican-api-log" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.728220 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5c49d658df-5r5zg" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.734452 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.734736 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.740270 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.746390 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5c49d658df-5r5zg"] Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.814358 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f82147b-63cd-44bc-8950-bf87fa407688-internal-tls-certs\") pod \"swift-proxy-5c49d658df-5r5zg\" (UID: \"4f82147b-63cd-44bc-8950-bf87fa407688\") " 
pod="openstack/swift-proxy-5c49d658df-5r5zg" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.814448 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f82147b-63cd-44bc-8950-bf87fa407688-public-tls-certs\") pod \"swift-proxy-5c49d658df-5r5zg\" (UID: \"4f82147b-63cd-44bc-8950-bf87fa407688\") " pod="openstack/swift-proxy-5c49d658df-5r5zg" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.814485 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4f82147b-63cd-44bc-8950-bf87fa407688-etc-swift\") pod \"swift-proxy-5c49d658df-5r5zg\" (UID: \"4f82147b-63cd-44bc-8950-bf87fa407688\") " pod="openstack/swift-proxy-5c49d658df-5r5zg" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.814563 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f82147b-63cd-44bc-8950-bf87fa407688-config-data\") pod \"swift-proxy-5c49d658df-5r5zg\" (UID: \"4f82147b-63cd-44bc-8950-bf87fa407688\") " pod="openstack/swift-proxy-5c49d658df-5r5zg" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.814756 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m85x4\" (UniqueName: \"kubernetes.io/projected/4f82147b-63cd-44bc-8950-bf87fa407688-kube-api-access-m85x4\") pod \"swift-proxy-5c49d658df-5r5zg\" (UID: \"4f82147b-63cd-44bc-8950-bf87fa407688\") " pod="openstack/swift-proxy-5c49d658df-5r5zg" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.814871 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f82147b-63cd-44bc-8950-bf87fa407688-combined-ca-bundle\") pod \"swift-proxy-5c49d658df-5r5zg\" (UID: 
\"4f82147b-63cd-44bc-8950-bf87fa407688\") " pod="openstack/swift-proxy-5c49d658df-5r5zg" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.814916 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f82147b-63cd-44bc-8950-bf87fa407688-log-httpd\") pod \"swift-proxy-5c49d658df-5r5zg\" (UID: \"4f82147b-63cd-44bc-8950-bf87fa407688\") " pod="openstack/swift-proxy-5c49d658df-5r5zg" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.814935 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f82147b-63cd-44bc-8950-bf87fa407688-run-httpd\") pod \"swift-proxy-5c49d658df-5r5zg\" (UID: \"4f82147b-63cd-44bc-8950-bf87fa407688\") " pod="openstack/swift-proxy-5c49d658df-5r5zg" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.916413 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4f82147b-63cd-44bc-8950-bf87fa407688-etc-swift\") pod \"swift-proxy-5c49d658df-5r5zg\" (UID: \"4f82147b-63cd-44bc-8950-bf87fa407688\") " pod="openstack/swift-proxy-5c49d658df-5r5zg" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.916759 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f82147b-63cd-44bc-8950-bf87fa407688-config-data\") pod \"swift-proxy-5c49d658df-5r5zg\" (UID: \"4f82147b-63cd-44bc-8950-bf87fa407688\") " pod="openstack/swift-proxy-5c49d658df-5r5zg" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.916920 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m85x4\" (UniqueName: \"kubernetes.io/projected/4f82147b-63cd-44bc-8950-bf87fa407688-kube-api-access-m85x4\") pod \"swift-proxy-5c49d658df-5r5zg\" (UID: \"4f82147b-63cd-44bc-8950-bf87fa407688\") 
" pod="openstack/swift-proxy-5c49d658df-5r5zg" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.917021 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f82147b-63cd-44bc-8950-bf87fa407688-combined-ca-bundle\") pod \"swift-proxy-5c49d658df-5r5zg\" (UID: \"4f82147b-63cd-44bc-8950-bf87fa407688\") " pod="openstack/swift-proxy-5c49d658df-5r5zg" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.917095 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f82147b-63cd-44bc-8950-bf87fa407688-run-httpd\") pod \"swift-proxy-5c49d658df-5r5zg\" (UID: \"4f82147b-63cd-44bc-8950-bf87fa407688\") " pod="openstack/swift-proxy-5c49d658df-5r5zg" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.917186 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f82147b-63cd-44bc-8950-bf87fa407688-log-httpd\") pod \"swift-proxy-5c49d658df-5r5zg\" (UID: \"4f82147b-63cd-44bc-8950-bf87fa407688\") " pod="openstack/swift-proxy-5c49d658df-5r5zg" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.917315 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f82147b-63cd-44bc-8950-bf87fa407688-internal-tls-certs\") pod \"swift-proxy-5c49d658df-5r5zg\" (UID: \"4f82147b-63cd-44bc-8950-bf87fa407688\") " pod="openstack/swift-proxy-5c49d658df-5r5zg" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.917427 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f82147b-63cd-44bc-8950-bf87fa407688-public-tls-certs\") pod \"swift-proxy-5c49d658df-5r5zg\" (UID: \"4f82147b-63cd-44bc-8950-bf87fa407688\") " pod="openstack/swift-proxy-5c49d658df-5r5zg" Oct 03 15:01:47 
crc kubenswrapper[4774]: I1003 15:01:47.917613 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f82147b-63cd-44bc-8950-bf87fa407688-run-httpd\") pod \"swift-proxy-5c49d658df-5r5zg\" (UID: \"4f82147b-63cd-44bc-8950-bf87fa407688\") " pod="openstack/swift-proxy-5c49d658df-5r5zg" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.917713 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f82147b-63cd-44bc-8950-bf87fa407688-log-httpd\") pod \"swift-proxy-5c49d658df-5r5zg\" (UID: \"4f82147b-63cd-44bc-8950-bf87fa407688\") " pod="openstack/swift-proxy-5c49d658df-5r5zg" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.922640 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f82147b-63cd-44bc-8950-bf87fa407688-combined-ca-bundle\") pod \"swift-proxy-5c49d658df-5r5zg\" (UID: \"4f82147b-63cd-44bc-8950-bf87fa407688\") " pod="openstack/swift-proxy-5c49d658df-5r5zg" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.925330 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f82147b-63cd-44bc-8950-bf87fa407688-config-data\") pod \"swift-proxy-5c49d658df-5r5zg\" (UID: \"4f82147b-63cd-44bc-8950-bf87fa407688\") " pod="openstack/swift-proxy-5c49d658df-5r5zg" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.925804 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f82147b-63cd-44bc-8950-bf87fa407688-public-tls-certs\") pod \"swift-proxy-5c49d658df-5r5zg\" (UID: \"4f82147b-63cd-44bc-8950-bf87fa407688\") " pod="openstack/swift-proxy-5c49d658df-5r5zg" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.925994 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f82147b-63cd-44bc-8950-bf87fa407688-internal-tls-certs\") pod \"swift-proxy-5c49d658df-5r5zg\" (UID: \"4f82147b-63cd-44bc-8950-bf87fa407688\") " pod="openstack/swift-proxy-5c49d658df-5r5zg" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.926202 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4f82147b-63cd-44bc-8950-bf87fa407688-etc-swift\") pod \"swift-proxy-5c49d658df-5r5zg\" (UID: \"4f82147b-63cd-44bc-8950-bf87fa407688\") " pod="openstack/swift-proxy-5c49d658df-5r5zg" Oct 03 15:01:47 crc kubenswrapper[4774]: I1003 15:01:47.940303 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m85x4\" (UniqueName: \"kubernetes.io/projected/4f82147b-63cd-44bc-8950-bf87fa407688-kube-api-access-m85x4\") pod \"swift-proxy-5c49d658df-5r5zg\" (UID: \"4f82147b-63cd-44bc-8950-bf87fa407688\") " pod="openstack/swift-proxy-5c49d658df-5r5zg" Oct 03 15:01:48 crc kubenswrapper[4774]: I1003 15:01:48.052709 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5c49d658df-5r5zg" Oct 03 15:01:48 crc kubenswrapper[4774]: I1003 15:01:48.506595 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5784cf869f-lvqzl" Oct 03 15:01:48 crc kubenswrapper[4774]: I1003 15:01:48.576249 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-8gc5c"] Oct 03 15:01:48 crc kubenswrapper[4774]: I1003 15:01:48.632178 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5c49d658df-5r5zg"] Oct 03 15:01:48 crc kubenswrapper[4774]: I1003 15:01:48.975281 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:01:48 crc kubenswrapper[4774]: I1003 15:01:48.976941 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5c51760f-5a26-453e-b578-3bc16d784a4a" containerName="proxy-httpd" containerID="cri-o://929567c8817b96e7c982eb4b05370573427dbb7f549f86e306e63a28d248bac1" gracePeriod=30 Oct 03 15:01:48 crc kubenswrapper[4774]: I1003 15:01:48.977279 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5c51760f-5a26-453e-b578-3bc16d784a4a" containerName="sg-core" containerID="cri-o://558706c0c65d7eb0e5d8e36f2210d186dcc18a506673e23cbe79dcc06a747ecf" gracePeriod=30 Oct 03 15:01:48 crc kubenswrapper[4774]: I1003 15:01:48.977391 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5c51760f-5a26-453e-b578-3bc16d784a4a" containerName="ceilometer-notification-agent" containerID="cri-o://858c84834eaf180af71bc0ede5599a2221a672db851281feecbd46e4d4d042c7" gracePeriod=30 Oct 03 15:01:49 crc kubenswrapper[4774]: I1003 15:01:48.976325 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5c51760f-5a26-453e-b578-3bc16d784a4a" 
containerName="ceilometer-central-agent" containerID="cri-o://2ab27bcf9d1600d0ab6f3b362911893a98a6cf7d2181a8b461f383fbca930634" gracePeriod=30 Oct 03 15:01:49 crc kubenswrapper[4774]: I1003 15:01:49.087938 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="5c51760f-5a26-453e-b578-3bc16d784a4a" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.155:3000/\": read tcp 10.217.0.2:51836->10.217.0.155:3000: read: connection reset by peer" Oct 03 15:01:49 crc kubenswrapper[4774]: I1003 15:01:49.182990 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c49d658df-5r5zg" event={"ID":"4f82147b-63cd-44bc-8950-bf87fa407688","Type":"ContainerStarted","Data":"8c8129c31710c79314e74c7340999dddcc6f03b977d0d72d821a30a03952f13f"} Oct 03 15:01:49 crc kubenswrapper[4774]: I1003 15:01:49.183237 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c49d658df-5r5zg" event={"ID":"4f82147b-63cd-44bc-8950-bf87fa407688","Type":"ContainerStarted","Data":"aa70cdfa0fa7eb427d6af0d9d226ec0fa34a426fe5f9b61caab296ec99d1af7f"} Oct 03 15:01:49 crc kubenswrapper[4774]: I1003 15:01:49.188924 4774 generic.go:334] "Generic (PLEG): container finished" podID="5c51760f-5a26-453e-b578-3bc16d784a4a" containerID="558706c0c65d7eb0e5d8e36f2210d186dcc18a506673e23cbe79dcc06a747ecf" exitCode=2 Oct 03 15:01:49 crc kubenswrapper[4774]: I1003 15:01:49.188997 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c51760f-5a26-453e-b578-3bc16d784a4a","Type":"ContainerDied","Data":"558706c0c65d7eb0e5d8e36f2210d186dcc18a506673e23cbe79dcc06a747ecf"} Oct 03 15:01:49 crc kubenswrapper[4774]: I1003 15:01:49.189204 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8b5c85b87-8gc5c" podUID="3308cda8-c038-4fbc-91ad-824ce2c1d85c" containerName="dnsmasq-dns" 
containerID="cri-o://d3b421bd6d3316b3283e7c33022a26900e008c55af8794448be7869d582c2540" gracePeriod=10 Oct 03 15:01:50 crc kubenswrapper[4774]: I1003 15:01:50.214392 4774 generic.go:334] "Generic (PLEG): container finished" podID="5c51760f-5a26-453e-b578-3bc16d784a4a" containerID="929567c8817b96e7c982eb4b05370573427dbb7f549f86e306e63a28d248bac1" exitCode=0 Oct 03 15:01:50 crc kubenswrapper[4774]: I1003 15:01:50.214790 4774 generic.go:334] "Generic (PLEG): container finished" podID="5c51760f-5a26-453e-b578-3bc16d784a4a" containerID="858c84834eaf180af71bc0ede5599a2221a672db851281feecbd46e4d4d042c7" exitCode=0 Oct 03 15:01:50 crc kubenswrapper[4774]: I1003 15:01:50.214806 4774 generic.go:334] "Generic (PLEG): container finished" podID="5c51760f-5a26-453e-b578-3bc16d784a4a" containerID="2ab27bcf9d1600d0ab6f3b362911893a98a6cf7d2181a8b461f383fbca930634" exitCode=0 Oct 03 15:01:50 crc kubenswrapper[4774]: I1003 15:01:50.214451 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c51760f-5a26-453e-b578-3bc16d784a4a","Type":"ContainerDied","Data":"929567c8817b96e7c982eb4b05370573427dbb7f549f86e306e63a28d248bac1"} Oct 03 15:01:50 crc kubenswrapper[4774]: I1003 15:01:50.214899 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c51760f-5a26-453e-b578-3bc16d784a4a","Type":"ContainerDied","Data":"858c84834eaf180af71bc0ede5599a2221a672db851281feecbd46e4d4d042c7"} Oct 03 15:01:50 crc kubenswrapper[4774]: I1003 15:01:50.214920 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c51760f-5a26-453e-b578-3bc16d784a4a","Type":"ContainerDied","Data":"2ab27bcf9d1600d0ab6f3b362911893a98a6cf7d2181a8b461f383fbca930634"} Oct 03 15:01:50 crc kubenswrapper[4774]: I1003 15:01:50.220855 4774 generic.go:334] "Generic (PLEG): container finished" podID="3308cda8-c038-4fbc-91ad-824ce2c1d85c" 
containerID="d3b421bd6d3316b3283e7c33022a26900e008c55af8794448be7869d582c2540" exitCode=0 Oct 03 15:01:50 crc kubenswrapper[4774]: I1003 15:01:50.220956 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-8gc5c" event={"ID":"3308cda8-c038-4fbc-91ad-824ce2c1d85c","Type":"ContainerDied","Data":"d3b421bd6d3316b3283e7c33022a26900e008c55af8794448be7869d582c2540"} Oct 03 15:01:50 crc kubenswrapper[4774]: I1003 15:01:50.235547 4774 generic.go:334] "Generic (PLEG): container finished" podID="9ab201e1-9ef3-485b-81f2-0b421dcc66cc" containerID="48840e45353d51bec85377ead7e54d4b8819f3ac710969f2fb5e297e370878d5" exitCode=0 Oct 03 15:01:50 crc kubenswrapper[4774]: I1003 15:01:50.235632 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9ab201e1-9ef3-485b-81f2-0b421dcc66cc","Type":"ContainerDied","Data":"48840e45353d51bec85377ead7e54d4b8819f3ac710969f2fb5e297e370878d5"} Oct 03 15:01:50 crc kubenswrapper[4774]: I1003 15:01:50.238820 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c49d658df-5r5zg" event={"ID":"4f82147b-63cd-44bc-8950-bf87fa407688","Type":"ContainerStarted","Data":"bf775c0a11cc157c5b56021ef74ad32fa25b14cc8448e7ed29c6c78cf10a0338"} Oct 03 15:01:50 crc kubenswrapper[4774]: I1003 15:01:50.240582 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5c49d658df-5r5zg" Oct 03 15:01:50 crc kubenswrapper[4774]: I1003 15:01:50.240618 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5c49d658df-5r5zg" Oct 03 15:01:50 crc kubenswrapper[4774]: I1003 15:01:50.263888 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5c49d658df-5r5zg" podStartSLOduration=3.263861638 podStartE2EDuration="3.263861638s" podCreationTimestamp="2025-10-03 15:01:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:01:50.256777102 +0000 UTC m=+1132.845980554" watchObservedRunningTime="2025-10-03 15:01:50.263861638 +0000 UTC m=+1132.853065090" Oct 03 15:01:51 crc kubenswrapper[4774]: I1003 15:01:51.612454 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-bc5bdf456-xt2x4" podUID="871f7d16-54b6-4aa9-8e99-00a888d41f70" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Oct 03 15:01:51 crc kubenswrapper[4774]: I1003 15:01:51.612932 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-bc5bdf456-xt2x4" Oct 03 15:01:53 crc kubenswrapper[4774]: I1003 15:01:53.067282 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5c49d658df-5r5zg" Oct 03 15:01:53 crc kubenswrapper[4774]: I1003 15:01:53.824307 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="5c51760f-5a26-453e-b578-3bc16d784a4a" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.155:3000/\": dial tcp 10.217.0.155:3000: connect: connection refused" Oct 03 15:01:54 crc kubenswrapper[4774]: I1003 15:01:54.189736 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-tggq5"] Oct 03 15:01:54 crc kubenswrapper[4774]: I1003 15:01:54.191189 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-tggq5" Oct 03 15:01:54 crc kubenswrapper[4774]: I1003 15:01:54.202141 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-tggq5"] Oct 03 15:01:54 crc kubenswrapper[4774]: I1003 15:01:54.263110 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76f46\" (UniqueName: \"kubernetes.io/projected/367d8aee-34b7-485e-848a-3e267afa8fd6-kube-api-access-76f46\") pod \"nova-api-db-create-tggq5\" (UID: \"367d8aee-34b7-485e-848a-3e267afa8fd6\") " pod="openstack/nova-api-db-create-tggq5" Oct 03 15:01:54 crc kubenswrapper[4774]: I1003 15:01:54.275996 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-8qxhd"] Oct 03 15:01:54 crc kubenswrapper[4774]: I1003 15:01:54.277251 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-8qxhd" Oct 03 15:01:54 crc kubenswrapper[4774]: I1003 15:01:54.291573 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-8qxhd"] Oct 03 15:01:54 crc kubenswrapper[4774]: I1003 15:01:54.366351 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xltn2\" (UniqueName: \"kubernetes.io/projected/6dde1c72-df79-4066-837c-0e318b636b73-kube-api-access-xltn2\") pod \"nova-cell0-db-create-8qxhd\" (UID: \"6dde1c72-df79-4066-837c-0e318b636b73\") " pod="openstack/nova-cell0-db-create-8qxhd" Oct 03 15:01:54 crc kubenswrapper[4774]: I1003 15:01:54.366509 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76f46\" (UniqueName: \"kubernetes.io/projected/367d8aee-34b7-485e-848a-3e267afa8fd6-kube-api-access-76f46\") pod \"nova-api-db-create-tggq5\" (UID: \"367d8aee-34b7-485e-848a-3e267afa8fd6\") " pod="openstack/nova-api-db-create-tggq5" Oct 03 15:01:54 crc kubenswrapper[4774]: 
I1003 15:01:54.375322 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-mgszf"] Oct 03 15:01:54 crc kubenswrapper[4774]: I1003 15:01:54.376389 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-mgszf" Oct 03 15:01:54 crc kubenswrapper[4774]: I1003 15:01:54.387183 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-mgszf"] Oct 03 15:01:54 crc kubenswrapper[4774]: I1003 15:01:54.419717 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76f46\" (UniqueName: \"kubernetes.io/projected/367d8aee-34b7-485e-848a-3e267afa8fd6-kube-api-access-76f46\") pod \"nova-api-db-create-tggq5\" (UID: \"367d8aee-34b7-485e-848a-3e267afa8fd6\") " pod="openstack/nova-api-db-create-tggq5" Oct 03 15:01:54 crc kubenswrapper[4774]: I1003 15:01:54.473887 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xltn2\" (UniqueName: \"kubernetes.io/projected/6dde1c72-df79-4066-837c-0e318b636b73-kube-api-access-xltn2\") pod \"nova-cell0-db-create-8qxhd\" (UID: \"6dde1c72-df79-4066-837c-0e318b636b73\") " pod="openstack/nova-cell0-db-create-8qxhd" Oct 03 15:01:54 crc kubenswrapper[4774]: I1003 15:01:54.473973 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpxj5\" (UniqueName: \"kubernetes.io/projected/38b934ad-cf29-40d1-993c-8dcc6b8c0b8c-kube-api-access-dpxj5\") pod \"nova-cell1-db-create-mgszf\" (UID: \"38b934ad-cf29-40d1-993c-8dcc6b8c0b8c\") " pod="openstack/nova-cell1-db-create-mgszf" Oct 03 15:01:54 crc kubenswrapper[4774]: I1003 15:01:54.492169 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xltn2\" (UniqueName: \"kubernetes.io/projected/6dde1c72-df79-4066-837c-0e318b636b73-kube-api-access-xltn2\") pod \"nova-cell0-db-create-8qxhd\" (UID: 
\"6dde1c72-df79-4066-837c-0e318b636b73\") " pod="openstack/nova-cell0-db-create-8qxhd" Oct 03 15:01:54 crc kubenswrapper[4774]: I1003 15:01:54.523300 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-tggq5" Oct 03 15:01:54 crc kubenswrapper[4774]: I1003 15:01:54.575216 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpxj5\" (UniqueName: \"kubernetes.io/projected/38b934ad-cf29-40d1-993c-8dcc6b8c0b8c-kube-api-access-dpxj5\") pod \"nova-cell1-db-create-mgszf\" (UID: \"38b934ad-cf29-40d1-993c-8dcc6b8c0b8c\") " pod="openstack/nova-cell1-db-create-mgszf" Oct 03 15:01:54 crc kubenswrapper[4774]: I1003 15:01:54.595940 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-8qxhd" Oct 03 15:01:54 crc kubenswrapper[4774]: I1003 15:01:54.599398 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpxj5\" (UniqueName: \"kubernetes.io/projected/38b934ad-cf29-40d1-993c-8dcc6b8c0b8c-kube-api-access-dpxj5\") pod \"nova-cell1-db-create-mgszf\" (UID: \"38b934ad-cf29-40d1-993c-8dcc6b8c0b8c\") " pod="openstack/nova-cell1-db-create-mgszf" Oct 03 15:01:54 crc kubenswrapper[4774]: I1003 15:01:54.692419 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-mgszf" Oct 03 15:01:55 crc kubenswrapper[4774]: I1003 15:01:55.088209 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 03 15:01:55 crc kubenswrapper[4774]: I1003 15:01:55.530051 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 15:01:55 crc kubenswrapper[4774]: I1003 15:01:55.530322 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5a4f000d-fdd6-46ec-b6b4-55a574ca801d" containerName="glance-log" containerID="cri-o://b463abc99cf516a57bf6debf4dde0935b961cbb0865fa8230aa54cdca3c79d65" gracePeriod=30 Oct 03 15:01:55 crc kubenswrapper[4774]: I1003 15:01:55.530771 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5a4f000d-fdd6-46ec-b6b4-55a574ca801d" containerName="glance-httpd" containerID="cri-o://6dadf5c7a6046e152b21be98bc4a11f540148be2f9f8dbd1628a8d83d1d43ea2" gracePeriod=30 Oct 03 15:01:55 crc kubenswrapper[4774]: I1003 15:01:55.739004 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8b5c85b87-8gc5c" podUID="3308cda8-c038-4fbc-91ad-824ce2c1d85c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.138:5353: i/o timeout" Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.373867 4774 generic.go:334] "Generic (PLEG): container finished" podID="871f7d16-54b6-4aa9-8e99-00a888d41f70" containerID="4c3807337ee7f2b8e44c4122821db338ac04f20ba3e7962936f507b835bfe6b9" exitCode=137 Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.374215 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bc5bdf456-xt2x4" event={"ID":"871f7d16-54b6-4aa9-8e99-00a888d41f70","Type":"ContainerDied","Data":"4c3807337ee7f2b8e44c4122821db338ac04f20ba3e7962936f507b835bfe6b9"} Oct 03 
15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.468787 4774 generic.go:334] "Generic (PLEG): container finished" podID="5a4f000d-fdd6-46ec-b6b4-55a574ca801d" containerID="b463abc99cf516a57bf6debf4dde0935b961cbb0865fa8230aa54cdca3c79d65" exitCode=143 Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.469137 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5a4f000d-fdd6-46ec-b6b4-55a574ca801d","Type":"ContainerDied","Data":"b463abc99cf516a57bf6debf4dde0935b961cbb0865fa8230aa54cdca3c79d65"} Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.485539 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-8gc5c" Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.619546 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3308cda8-c038-4fbc-91ad-824ce2c1d85c-ovsdbserver-sb\") pod \"3308cda8-c038-4fbc-91ad-824ce2c1d85c\" (UID: \"3308cda8-c038-4fbc-91ad-824ce2c1d85c\") " Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.620671 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3308cda8-c038-4fbc-91ad-824ce2c1d85c-ovsdbserver-nb\") pod \"3308cda8-c038-4fbc-91ad-824ce2c1d85c\" (UID: \"3308cda8-c038-4fbc-91ad-824ce2c1d85c\") " Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.620832 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3308cda8-c038-4fbc-91ad-824ce2c1d85c-dns-swift-storage-0\") pod \"3308cda8-c038-4fbc-91ad-824ce2c1d85c\" (UID: \"3308cda8-c038-4fbc-91ad-824ce2c1d85c\") " Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.620864 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cbgs\" 
(UniqueName: \"kubernetes.io/projected/3308cda8-c038-4fbc-91ad-824ce2c1d85c-kube-api-access-2cbgs\") pod \"3308cda8-c038-4fbc-91ad-824ce2c1d85c\" (UID: \"3308cda8-c038-4fbc-91ad-824ce2c1d85c\") " Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.620942 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3308cda8-c038-4fbc-91ad-824ce2c1d85c-config\") pod \"3308cda8-c038-4fbc-91ad-824ce2c1d85c\" (UID: \"3308cda8-c038-4fbc-91ad-824ce2c1d85c\") " Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.620975 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3308cda8-c038-4fbc-91ad-824ce2c1d85c-dns-svc\") pod \"3308cda8-c038-4fbc-91ad-824ce2c1d85c\" (UID: \"3308cda8-c038-4fbc-91ad-824ce2c1d85c\") " Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.629804 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3308cda8-c038-4fbc-91ad-824ce2c1d85c-kube-api-access-2cbgs" (OuterVolumeSpecName: "kube-api-access-2cbgs") pod "3308cda8-c038-4fbc-91ad-824ce2c1d85c" (UID: "3308cda8-c038-4fbc-91ad-824ce2c1d85c"). InnerVolumeSpecName "kube-api-access-2cbgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.723440 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cbgs\" (UniqueName: \"kubernetes.io/projected/3308cda8-c038-4fbc-91ad-824ce2c1d85c-kube-api-access-2cbgs\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.734150 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-bc5bdf456-xt2x4" Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.741768 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3308cda8-c038-4fbc-91ad-824ce2c1d85c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3308cda8-c038-4fbc-91ad-824ce2c1d85c" (UID: "3308cda8-c038-4fbc-91ad-824ce2c1d85c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.749248 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3308cda8-c038-4fbc-91ad-824ce2c1d85c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3308cda8-c038-4fbc-91ad-824ce2c1d85c" (UID: "3308cda8-c038-4fbc-91ad-824ce2c1d85c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.761845 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3308cda8-c038-4fbc-91ad-824ce2c1d85c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3308cda8-c038-4fbc-91ad-824ce2c1d85c" (UID: "3308cda8-c038-4fbc-91ad-824ce2c1d85c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.766972 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3308cda8-c038-4fbc-91ad-824ce2c1d85c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3308cda8-c038-4fbc-91ad-824ce2c1d85c" (UID: "3308cda8-c038-4fbc-91ad-824ce2c1d85c"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.774672 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3308cda8-c038-4fbc-91ad-824ce2c1d85c-config" (OuterVolumeSpecName: "config") pod "3308cda8-c038-4fbc-91ad-824ce2c1d85c" (UID: "3308cda8-c038-4fbc-91ad-824ce2c1d85c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.824227 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjcr2\" (UniqueName: \"kubernetes.io/projected/871f7d16-54b6-4aa9-8e99-00a888d41f70-kube-api-access-xjcr2\") pod \"871f7d16-54b6-4aa9-8e99-00a888d41f70\" (UID: \"871f7d16-54b6-4aa9-8e99-00a888d41f70\") " Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.824352 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/871f7d16-54b6-4aa9-8e99-00a888d41f70-config-data\") pod \"871f7d16-54b6-4aa9-8e99-00a888d41f70\" (UID: \"871f7d16-54b6-4aa9-8e99-00a888d41f70\") " Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.824524 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/871f7d16-54b6-4aa9-8e99-00a888d41f70-horizon-secret-key\") pod \"871f7d16-54b6-4aa9-8e99-00a888d41f70\" (UID: \"871f7d16-54b6-4aa9-8e99-00a888d41f70\") " Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.824562 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/871f7d16-54b6-4aa9-8e99-00a888d41f70-logs\") pod \"871f7d16-54b6-4aa9-8e99-00a888d41f70\" (UID: \"871f7d16-54b6-4aa9-8e99-00a888d41f70\") " Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.824611 4774 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/871f7d16-54b6-4aa9-8e99-00a888d41f70-scripts\") pod \"871f7d16-54b6-4aa9-8e99-00a888d41f70\" (UID: \"871f7d16-54b6-4aa9-8e99-00a888d41f70\") " Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.824674 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/871f7d16-54b6-4aa9-8e99-00a888d41f70-horizon-tls-certs\") pod \"871f7d16-54b6-4aa9-8e99-00a888d41f70\" (UID: \"871f7d16-54b6-4aa9-8e99-00a888d41f70\") " Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.824714 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/871f7d16-54b6-4aa9-8e99-00a888d41f70-combined-ca-bundle\") pod \"871f7d16-54b6-4aa9-8e99-00a888d41f70\" (UID: \"871f7d16-54b6-4aa9-8e99-00a888d41f70\") " Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.825204 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3308cda8-c038-4fbc-91ad-824ce2c1d85c-config\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.825221 4774 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3308cda8-c038-4fbc-91ad-824ce2c1d85c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.825231 4774 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3308cda8-c038-4fbc-91ad-824ce2c1d85c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.825244 4774 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3308cda8-c038-4fbc-91ad-824ce2c1d85c-ovsdbserver-nb\") on node \"crc\" DevicePath 
\"\"" Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.825254 4774 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3308cda8-c038-4fbc-91ad-824ce2c1d85c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.825218 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/871f7d16-54b6-4aa9-8e99-00a888d41f70-logs" (OuterVolumeSpecName: "logs") pod "871f7d16-54b6-4aa9-8e99-00a888d41f70" (UID: "871f7d16-54b6-4aa9-8e99-00a888d41f70"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.828801 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/871f7d16-54b6-4aa9-8e99-00a888d41f70-kube-api-access-xjcr2" (OuterVolumeSpecName: "kube-api-access-xjcr2") pod "871f7d16-54b6-4aa9-8e99-00a888d41f70" (UID: "871f7d16-54b6-4aa9-8e99-00a888d41f70"). InnerVolumeSpecName "kube-api-access-xjcr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.840163 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/871f7d16-54b6-4aa9-8e99-00a888d41f70-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "871f7d16-54b6-4aa9-8e99-00a888d41f70" (UID: "871f7d16-54b6-4aa9-8e99-00a888d41f70"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.849781 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/871f7d16-54b6-4aa9-8e99-00a888d41f70-config-data" (OuterVolumeSpecName: "config-data") pod "871f7d16-54b6-4aa9-8e99-00a888d41f70" (UID: "871f7d16-54b6-4aa9-8e99-00a888d41f70"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.882233 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/871f7d16-54b6-4aa9-8e99-00a888d41f70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "871f7d16-54b6-4aa9-8e99-00a888d41f70" (UID: "871f7d16-54b6-4aa9-8e99-00a888d41f70"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.882345 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/871f7d16-54b6-4aa9-8e99-00a888d41f70-scripts" (OuterVolumeSpecName: "scripts") pod "871f7d16-54b6-4aa9-8e99-00a888d41f70" (UID: "871f7d16-54b6-4aa9-8e99-00a888d41f70"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.904787 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/871f7d16-54b6-4aa9-8e99-00a888d41f70-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "871f7d16-54b6-4aa9-8e99-00a888d41f70" (UID: "871f7d16-54b6-4aa9-8e99-00a888d41f70"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.929649 4774 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/871f7d16-54b6-4aa9-8e99-00a888d41f70-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.929785 4774 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/871f7d16-54b6-4aa9-8e99-00a888d41f70-logs\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.932465 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/871f7d16-54b6-4aa9-8e99-00a888d41f70-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.932586 4774 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/871f7d16-54b6-4aa9-8e99-00a888d41f70-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.932688 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/871f7d16-54b6-4aa9-8e99-00a888d41f70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.932805 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjcr2\" (UniqueName: \"kubernetes.io/projected/871f7d16-54b6-4aa9-8e99-00a888d41f70-kube-api-access-xjcr2\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:56 crc kubenswrapper[4774]: I1003 15:01:56.932910 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/871f7d16-54b6-4aa9-8e99-00a888d41f70-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.007139 4774 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.107219 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.135262 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-combined-ca-bundle\") pod \"9ab201e1-9ef3-485b-81f2-0b421dcc66cc\" (UID: \"9ab201e1-9ef3-485b-81f2-0b421dcc66cc\") " Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.135333 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8tms\" (UniqueName: \"kubernetes.io/projected/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-kube-api-access-k8tms\") pod \"9ab201e1-9ef3-485b-81f2-0b421dcc66cc\" (UID: \"9ab201e1-9ef3-485b-81f2-0b421dcc66cc\") " Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.135384 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-etc-machine-id\") pod \"9ab201e1-9ef3-485b-81f2-0b421dcc66cc\" (UID: \"9ab201e1-9ef3-485b-81f2-0b421dcc66cc\") " Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.135482 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-config-data-custom\") pod \"9ab201e1-9ef3-485b-81f2-0b421dcc66cc\" (UID: \"9ab201e1-9ef3-485b-81f2-0b421dcc66cc\") " Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.135507 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-scripts\") pod \"9ab201e1-9ef3-485b-81f2-0b421dcc66cc\" 
(UID: \"9ab201e1-9ef3-485b-81f2-0b421dcc66cc\") " Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.135580 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-config-data\") pod \"9ab201e1-9ef3-485b-81f2-0b421dcc66cc\" (UID: \"9ab201e1-9ef3-485b-81f2-0b421dcc66cc\") " Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.136269 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9ab201e1-9ef3-485b-81f2-0b421dcc66cc" (UID: "9ab201e1-9ef3-485b-81f2-0b421dcc66cc"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.140905 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9ab201e1-9ef3-485b-81f2-0b421dcc66cc" (UID: "9ab201e1-9ef3-485b-81f2-0b421dcc66cc"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.142716 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-scripts" (OuterVolumeSpecName: "scripts") pod "9ab201e1-9ef3-485b-81f2-0b421dcc66cc" (UID: "9ab201e1-9ef3-485b-81f2-0b421dcc66cc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.143526 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-kube-api-access-k8tms" (OuterVolumeSpecName: "kube-api-access-k8tms") pod "9ab201e1-9ef3-485b-81f2-0b421dcc66cc" (UID: "9ab201e1-9ef3-485b-81f2-0b421dcc66cc"). InnerVolumeSpecName "kube-api-access-k8tms". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.206100 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ab201e1-9ef3-485b-81f2-0b421dcc66cc" (UID: "9ab201e1-9ef3-485b-81f2-0b421dcc66cc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.237208 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c51760f-5a26-453e-b578-3bc16d784a4a-combined-ca-bundle\") pod \"5c51760f-5a26-453e-b578-3bc16d784a4a\" (UID: \"5c51760f-5a26-453e-b578-3bc16d784a4a\") " Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.237302 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kr9w4\" (UniqueName: \"kubernetes.io/projected/5c51760f-5a26-453e-b578-3bc16d784a4a-kube-api-access-kr9w4\") pod \"5c51760f-5a26-453e-b578-3bc16d784a4a\" (UID: \"5c51760f-5a26-453e-b578-3bc16d784a4a\") " Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.237354 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c51760f-5a26-453e-b578-3bc16d784a4a-run-httpd\") pod \"5c51760f-5a26-453e-b578-3bc16d784a4a\" (UID: 
\"5c51760f-5a26-453e-b578-3bc16d784a4a\") " Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.237404 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c51760f-5a26-453e-b578-3bc16d784a4a-config-data\") pod \"5c51760f-5a26-453e-b578-3bc16d784a4a\" (UID: \"5c51760f-5a26-453e-b578-3bc16d784a4a\") " Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.237499 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c51760f-5a26-453e-b578-3bc16d784a4a-log-httpd\") pod \"5c51760f-5a26-453e-b578-3bc16d784a4a\" (UID: \"5c51760f-5a26-453e-b578-3bc16d784a4a\") " Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.237553 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c51760f-5a26-453e-b578-3bc16d784a4a-scripts\") pod \"5c51760f-5a26-453e-b578-3bc16d784a4a\" (UID: \"5c51760f-5a26-453e-b578-3bc16d784a4a\") " Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.237579 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5c51760f-5a26-453e-b578-3bc16d784a4a-sg-core-conf-yaml\") pod \"5c51760f-5a26-453e-b578-3bc16d784a4a\" (UID: \"5c51760f-5a26-453e-b578-3bc16d784a4a\") " Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.238225 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.238249 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8tms\" (UniqueName: \"kubernetes.io/projected/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-kube-api-access-k8tms\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:57 
crc kubenswrapper[4774]: I1003 15:01:57.238274 4774 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.238290 4774 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.238302 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.238658 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c51760f-5a26-453e-b578-3bc16d784a4a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5c51760f-5a26-453e-b578-3bc16d784a4a" (UID: "5c51760f-5a26-453e-b578-3bc16d784a4a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.238828 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c51760f-5a26-453e-b578-3bc16d784a4a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5c51760f-5a26-453e-b578-3bc16d784a4a" (UID: "5c51760f-5a26-453e-b578-3bc16d784a4a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.241794 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c51760f-5a26-453e-b578-3bc16d784a4a-kube-api-access-kr9w4" (OuterVolumeSpecName: "kube-api-access-kr9w4") pod "5c51760f-5a26-453e-b578-3bc16d784a4a" (UID: "5c51760f-5a26-453e-b578-3bc16d784a4a"). 
InnerVolumeSpecName "kube-api-access-kr9w4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.242935 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c51760f-5a26-453e-b578-3bc16d784a4a-scripts" (OuterVolumeSpecName: "scripts") pod "5c51760f-5a26-453e-b578-3bc16d784a4a" (UID: "5c51760f-5a26-453e-b578-3bc16d784a4a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.268083 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c51760f-5a26-453e-b578-3bc16d784a4a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5c51760f-5a26-453e-b578-3bc16d784a4a" (UID: "5c51760f-5a26-453e-b578-3bc16d784a4a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.280182 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-config-data" (OuterVolumeSpecName: "config-data") pod "9ab201e1-9ef3-485b-81f2-0b421dcc66cc" (UID: "9ab201e1-9ef3-485b-81f2-0b421dcc66cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.335447 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c51760f-5a26-453e-b578-3bc16d784a4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c51760f-5a26-453e-b578-3bc16d784a4a" (UID: "5c51760f-5a26-453e-b578-3bc16d784a4a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.340245 4774 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c51760f-5a26-453e-b578-3bc16d784a4a-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.340278 4774 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c51760f-5a26-453e-b578-3bc16d784a4a-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.340287 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c51760f-5a26-453e-b578-3bc16d784a4a-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.340297 4774 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5c51760f-5a26-453e-b578-3bc16d784a4a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.340310 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab201e1-9ef3-485b-81f2-0b421dcc66cc-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.340320 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c51760f-5a26-453e-b578-3bc16d784a4a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.340331 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kr9w4\" (UniqueName: \"kubernetes.io/projected/5c51760f-5a26-453e-b578-3bc16d784a4a-kube-api-access-kr9w4\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.374191 4774 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c51760f-5a26-453e-b578-3bc16d784a4a-config-data" (OuterVolumeSpecName: "config-data") pod "5c51760f-5a26-453e-b578-3bc16d784a4a" (UID: "5c51760f-5a26-453e-b578-3bc16d784a4a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.442103 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c51760f-5a26-453e-b578-3bc16d784a4a-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.450145 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-tggq5"] Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.458092 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-mgszf"] Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.470105 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-8qxhd"] Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.496837 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-mgszf" event={"ID":"38b934ad-cf29-40d1-993c-8dcc6b8c0b8c","Type":"ContainerStarted","Data":"7fb9c015f9277ee4891e9109fdcbe04384bc581afe2467c7260fb0597284b9ac"} Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.505459 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9ab201e1-9ef3-485b-81f2-0b421dcc66cc","Type":"ContainerDied","Data":"fe8ef285257cad8445b1f6f1fc80879753383454c33083ccb6fc4bc636252111"} Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.505518 4774 scope.go:117] "RemoveContainer" containerID="cec1c430a7ebe4634f855fdd18afe1dff2fbf97054719268f31d4a4bfdb5c9e9" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.505518 4774 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.520238 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c51760f-5a26-453e-b578-3bc16d784a4a","Type":"ContainerDied","Data":"bdcecca6063c86549888b1bc0fea207b9ea8997d51e258721403e8739a9133ad"} Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.520768 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.531587 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-tggq5" event={"ID":"367d8aee-34b7-485e-848a-3e267afa8fd6","Type":"ContainerStarted","Data":"a7e40f08472c61b9efdd17d89fb2d5bd5d8c49dcc8504bb48cbb040463059410"} Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.538782 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bc5bdf456-xt2x4" event={"ID":"871f7d16-54b6-4aa9-8e99-00a888d41f70","Type":"ContainerDied","Data":"e2a6a155f04b0405eecdb9c21f2fbeee3a4261ebaa974d7ee00a51cecd0f7e5d"} Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.538899 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-bc5bdf456-xt2x4" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.542092 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-8qxhd" event={"ID":"6dde1c72-df79-4066-837c-0e318b636b73","Type":"ContainerStarted","Data":"47e6c3322f949eedd41a236bd565f18f65739ba57f0071d7afecb01041fef118"} Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.546028 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f975364c-ff2b-49bb-9e5e-c0fea0d15daa","Type":"ContainerStarted","Data":"6ccc32b67699fba7b4d793abb13e43f9afa1d2a37a399c2e3e4f4ed09e4ab710"} Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.555328 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-8gc5c" event={"ID":"3308cda8-c038-4fbc-91ad-824ce2c1d85c","Type":"ContainerDied","Data":"87eaabb98753bc48d382234dd58cd7845b50f26fbc2a03b5aff1016cbf2a8ada"} Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.555541 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-8gc5c" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.586207 4774 scope.go:117] "RemoveContainer" containerID="48840e45353d51bec85377ead7e54d4b8819f3ac710969f2fb5e297e370878d5" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.594999 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.948869862 podStartE2EDuration="14.594980085s" podCreationTimestamp="2025-10-03 15:01:43 +0000 UTC" firstStartedPulling="2025-10-03 15:01:43.993900253 +0000 UTC m=+1126.583103705" lastFinishedPulling="2025-10-03 15:01:56.640010486 +0000 UTC m=+1139.229213928" observedRunningTime="2025-10-03 15:01:57.589529959 +0000 UTC m=+1140.178733411" watchObservedRunningTime="2025-10-03 15:01:57.594980085 +0000 UTC m=+1140.184183537" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.622680 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-bc5bdf456-xt2x4"] Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.636500 4774 scope.go:117] "RemoveContainer" containerID="929567c8817b96e7c982eb4b05370573427dbb7f549f86e306e63a28d248bac1" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.641342 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-bc5bdf456-xt2x4"] Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.665317 4774 scope.go:117] "RemoveContainer" containerID="558706c0c65d7eb0e5d8e36f2210d186dcc18a506673e23cbe79dcc06a747ecf" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.665455 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.690326 4774 scope.go:117] "RemoveContainer" containerID="858c84834eaf180af71bc0ede5599a2221a672db851281feecbd46e4d4d042c7" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.699281 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ceilometer-0"] Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.711422 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.722399 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.723908 4774 scope.go:117] "RemoveContainer" containerID="2ab27bcf9d1600d0ab6f3b362911893a98a6cf7d2181a8b461f383fbca930634" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.728888 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:01:57 crc kubenswrapper[4774]: E1003 15:01:57.729270 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="871f7d16-54b6-4aa9-8e99-00a888d41f70" containerName="horizon" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.729282 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="871f7d16-54b6-4aa9-8e99-00a888d41f70" containerName="horizon" Oct 03 15:01:57 crc kubenswrapper[4774]: E1003 15:01:57.729310 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c51760f-5a26-453e-b578-3bc16d784a4a" containerName="proxy-httpd" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.729316 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c51760f-5a26-453e-b578-3bc16d784a4a" containerName="proxy-httpd" Oct 03 15:01:57 crc kubenswrapper[4774]: E1003 15:01:57.729331 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3308cda8-c038-4fbc-91ad-824ce2c1d85c" containerName="dnsmasq-dns" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.729337 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="3308cda8-c038-4fbc-91ad-824ce2c1d85c" containerName="dnsmasq-dns" Oct 03 15:01:57 crc kubenswrapper[4774]: E1003 15:01:57.729348 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c51760f-5a26-453e-b578-3bc16d784a4a" 
containerName="sg-core" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.729355 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c51760f-5a26-453e-b578-3bc16d784a4a" containerName="sg-core" Oct 03 15:01:57 crc kubenswrapper[4774]: E1003 15:01:57.729366 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ab201e1-9ef3-485b-81f2-0b421dcc66cc" containerName="cinder-scheduler" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.729382 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ab201e1-9ef3-485b-81f2-0b421dcc66cc" containerName="cinder-scheduler" Oct 03 15:01:57 crc kubenswrapper[4774]: E1003 15:01:57.729393 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c51760f-5a26-453e-b578-3bc16d784a4a" containerName="ceilometer-notification-agent" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.729399 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c51760f-5a26-453e-b578-3bc16d784a4a" containerName="ceilometer-notification-agent" Oct 03 15:01:57 crc kubenswrapper[4774]: E1003 15:01:57.729410 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="871f7d16-54b6-4aa9-8e99-00a888d41f70" containerName="horizon-log" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.729416 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="871f7d16-54b6-4aa9-8e99-00a888d41f70" containerName="horizon-log" Oct 03 15:01:57 crc kubenswrapper[4774]: E1003 15:01:57.729429 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ab201e1-9ef3-485b-81f2-0b421dcc66cc" containerName="probe" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.729435 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ab201e1-9ef3-485b-81f2-0b421dcc66cc" containerName="probe" Oct 03 15:01:57 crc kubenswrapper[4774]: E1003 15:01:57.729449 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c51760f-5a26-453e-b578-3bc16d784a4a" 
containerName="ceilometer-central-agent" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.729455 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c51760f-5a26-453e-b578-3bc16d784a4a" containerName="ceilometer-central-agent" Oct 03 15:01:57 crc kubenswrapper[4774]: E1003 15:01:57.729467 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3308cda8-c038-4fbc-91ad-824ce2c1d85c" containerName="init" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.729473 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="3308cda8-c038-4fbc-91ad-824ce2c1d85c" containerName="init" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.729638 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c51760f-5a26-453e-b578-3bc16d784a4a" containerName="ceilometer-notification-agent" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.729653 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c51760f-5a26-453e-b578-3bc16d784a4a" containerName="proxy-httpd" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.729667 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ab201e1-9ef3-485b-81f2-0b421dcc66cc" containerName="cinder-scheduler" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.729681 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c51760f-5a26-453e-b578-3bc16d784a4a" containerName="ceilometer-central-agent" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.729689 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="871f7d16-54b6-4aa9-8e99-00a888d41f70" containerName="horizon" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.729698 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="3308cda8-c038-4fbc-91ad-824ce2c1d85c" containerName="dnsmasq-dns" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.729707 4774 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="871f7d16-54b6-4aa9-8e99-00a888d41f70" containerName="horizon-log" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.729719 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c51760f-5a26-453e-b578-3bc16d784a4a" containerName="sg-core" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.729730 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ab201e1-9ef3-485b-81f2-0b421dcc66cc" containerName="probe" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.731295 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.734530 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.734737 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.738343 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.739983 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.743642 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.744018 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-8gc5c"] Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.750947 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.765304 4774 scope.go:117] "RemoveContainer" containerID="af677314a00ddf8adf8d016fd4a1bdce9452562c501a73facceeff916c357b0b" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.767754 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-8gc5c"] Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.778250 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.846834 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccvns\" (UniqueName: \"kubernetes.io/projected/a370be32-2d52-48b7-b529-53e1d92a89a9-kube-api-access-ccvns\") pod \"cinder-scheduler-0\" (UID: \"a370be32-2d52-48b7-b529-53e1d92a89a9\") " pod="openstack/cinder-scheduler-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.847031 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/57e8d317-9bdc-4b6f-8705-92369e72c67a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"57e8d317-9bdc-4b6f-8705-92369e72c67a\") " pod="openstack/ceilometer-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.847180 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a370be32-2d52-48b7-b529-53e1d92a89a9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a370be32-2d52-48b7-b529-53e1d92a89a9\") " pod="openstack/cinder-scheduler-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.847292 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a370be32-2d52-48b7-b529-53e1d92a89a9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a370be32-2d52-48b7-b529-53e1d92a89a9\") " pod="openstack/cinder-scheduler-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.847325 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgp55\" (UniqueName: \"kubernetes.io/projected/57e8d317-9bdc-4b6f-8705-92369e72c67a-kube-api-access-pgp55\") pod \"ceilometer-0\" (UID: \"57e8d317-9bdc-4b6f-8705-92369e72c67a\") " pod="openstack/ceilometer-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.847552 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57e8d317-9bdc-4b6f-8705-92369e72c67a-scripts\") pod \"ceilometer-0\" (UID: \"57e8d317-9bdc-4b6f-8705-92369e72c67a\") " pod="openstack/ceilometer-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.847588 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57e8d317-9bdc-4b6f-8705-92369e72c67a-log-httpd\") pod \"ceilometer-0\" (UID: \"57e8d317-9bdc-4b6f-8705-92369e72c67a\") " pod="openstack/ceilometer-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.847612 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57e8d317-9bdc-4b6f-8705-92369e72c67a-config-data\") 
pod \"ceilometer-0\" (UID: \"57e8d317-9bdc-4b6f-8705-92369e72c67a\") " pod="openstack/ceilometer-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.847735 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a370be32-2d52-48b7-b529-53e1d92a89a9-config-data\") pod \"cinder-scheduler-0\" (UID: \"a370be32-2d52-48b7-b529-53e1d92a89a9\") " pod="openstack/cinder-scheduler-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.847812 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a370be32-2d52-48b7-b529-53e1d92a89a9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a370be32-2d52-48b7-b529-53e1d92a89a9\") " pod="openstack/cinder-scheduler-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.847850 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57e8d317-9bdc-4b6f-8705-92369e72c67a-run-httpd\") pod \"ceilometer-0\" (UID: \"57e8d317-9bdc-4b6f-8705-92369e72c67a\") " pod="openstack/ceilometer-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.847884 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57e8d317-9bdc-4b6f-8705-92369e72c67a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"57e8d317-9bdc-4b6f-8705-92369e72c67a\") " pod="openstack/ceilometer-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.847924 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a370be32-2d52-48b7-b529-53e1d92a89a9-scripts\") pod \"cinder-scheduler-0\" (UID: \"a370be32-2d52-48b7-b529-53e1d92a89a9\") " pod="openstack/cinder-scheduler-0" Oct 03 
15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.948599 4774 scope.go:117] "RemoveContainer" containerID="4c3807337ee7f2b8e44c4122821db338ac04f20ba3e7962936f507b835bfe6b9" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.949980 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57e8d317-9bdc-4b6f-8705-92369e72c67a-scripts\") pod \"ceilometer-0\" (UID: \"57e8d317-9bdc-4b6f-8705-92369e72c67a\") " pod="openstack/ceilometer-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.950026 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57e8d317-9bdc-4b6f-8705-92369e72c67a-log-httpd\") pod \"ceilometer-0\" (UID: \"57e8d317-9bdc-4b6f-8705-92369e72c67a\") " pod="openstack/ceilometer-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.950051 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57e8d317-9bdc-4b6f-8705-92369e72c67a-config-data\") pod \"ceilometer-0\" (UID: \"57e8d317-9bdc-4b6f-8705-92369e72c67a\") " pod="openstack/ceilometer-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.950093 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a370be32-2d52-48b7-b529-53e1d92a89a9-config-data\") pod \"cinder-scheduler-0\" (UID: \"a370be32-2d52-48b7-b529-53e1d92a89a9\") " pod="openstack/cinder-scheduler-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.950138 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57e8d317-9bdc-4b6f-8705-92369e72c67a-run-httpd\") pod \"ceilometer-0\" (UID: \"57e8d317-9bdc-4b6f-8705-92369e72c67a\") " pod="openstack/ceilometer-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.950161 4774 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a370be32-2d52-48b7-b529-53e1d92a89a9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a370be32-2d52-48b7-b529-53e1d92a89a9\") " pod="openstack/cinder-scheduler-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.950185 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57e8d317-9bdc-4b6f-8705-92369e72c67a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"57e8d317-9bdc-4b6f-8705-92369e72c67a\") " pod="openstack/ceilometer-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.950225 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a370be32-2d52-48b7-b529-53e1d92a89a9-scripts\") pod \"cinder-scheduler-0\" (UID: \"a370be32-2d52-48b7-b529-53e1d92a89a9\") " pod="openstack/cinder-scheduler-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.950280 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccvns\" (UniqueName: \"kubernetes.io/projected/a370be32-2d52-48b7-b529-53e1d92a89a9-kube-api-access-ccvns\") pod \"cinder-scheduler-0\" (UID: \"a370be32-2d52-48b7-b529-53e1d92a89a9\") " pod="openstack/cinder-scheduler-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.950320 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/57e8d317-9bdc-4b6f-8705-92369e72c67a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"57e8d317-9bdc-4b6f-8705-92369e72c67a\") " pod="openstack/ceilometer-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.950354 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/a370be32-2d52-48b7-b529-53e1d92a89a9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a370be32-2d52-48b7-b529-53e1d92a89a9\") " pod="openstack/cinder-scheduler-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.950476 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a370be32-2d52-48b7-b529-53e1d92a89a9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a370be32-2d52-48b7-b529-53e1d92a89a9\") " pod="openstack/cinder-scheduler-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.950503 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgp55\" (UniqueName: \"kubernetes.io/projected/57e8d317-9bdc-4b6f-8705-92369e72c67a-kube-api-access-pgp55\") pod \"ceilometer-0\" (UID: \"57e8d317-9bdc-4b6f-8705-92369e72c67a\") " pod="openstack/ceilometer-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.950742 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57e8d317-9bdc-4b6f-8705-92369e72c67a-run-httpd\") pod \"ceilometer-0\" (UID: \"57e8d317-9bdc-4b6f-8705-92369e72c67a\") " pod="openstack/ceilometer-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.951344 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57e8d317-9bdc-4b6f-8705-92369e72c67a-log-httpd\") pod \"ceilometer-0\" (UID: \"57e8d317-9bdc-4b6f-8705-92369e72c67a\") " pod="openstack/ceilometer-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.951483 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a370be32-2d52-48b7-b529-53e1d92a89a9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a370be32-2d52-48b7-b529-53e1d92a89a9\") " pod="openstack/cinder-scheduler-0" Oct 03 15:01:57 crc 
kubenswrapper[4774]: I1003 15:01:57.960897 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57e8d317-9bdc-4b6f-8705-92369e72c67a-scripts\") pod \"ceilometer-0\" (UID: \"57e8d317-9bdc-4b6f-8705-92369e72c67a\") " pod="openstack/ceilometer-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.961475 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a370be32-2d52-48b7-b529-53e1d92a89a9-scripts\") pod \"cinder-scheduler-0\" (UID: \"a370be32-2d52-48b7-b529-53e1d92a89a9\") " pod="openstack/cinder-scheduler-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.967777 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a370be32-2d52-48b7-b529-53e1d92a89a9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a370be32-2d52-48b7-b529-53e1d92a89a9\") " pod="openstack/cinder-scheduler-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.968636 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a370be32-2d52-48b7-b529-53e1d92a89a9-config-data\") pod \"cinder-scheduler-0\" (UID: \"a370be32-2d52-48b7-b529-53e1d92a89a9\") " pod="openstack/cinder-scheduler-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.972573 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57e8d317-9bdc-4b6f-8705-92369e72c67a-config-data\") pod \"ceilometer-0\" (UID: \"57e8d317-9bdc-4b6f-8705-92369e72c67a\") " pod="openstack/ceilometer-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.973628 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57e8d317-9bdc-4b6f-8705-92369e72c67a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"57e8d317-9bdc-4b6f-8705-92369e72c67a\") " pod="openstack/ceilometer-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.975042 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccvns\" (UniqueName: \"kubernetes.io/projected/a370be32-2d52-48b7-b529-53e1d92a89a9-kube-api-access-ccvns\") pod \"cinder-scheduler-0\" (UID: \"a370be32-2d52-48b7-b529-53e1d92a89a9\") " pod="openstack/cinder-scheduler-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.977269 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/57e8d317-9bdc-4b6f-8705-92369e72c67a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"57e8d317-9bdc-4b6f-8705-92369e72c67a\") " pod="openstack/ceilometer-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.978735 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a370be32-2d52-48b7-b529-53e1d92a89a9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a370be32-2d52-48b7-b529-53e1d92a89a9\") " pod="openstack/cinder-scheduler-0" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.980254 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:01:57 crc kubenswrapper[4774]: E1003 15:01:57.981178 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-pgp55], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="57e8d317-9bdc-4b6f-8705-92369e72c67a" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.986551 4774 scope.go:117] "RemoveContainer" containerID="d3b421bd6d3316b3283e7c33022a26900e008c55af8794448be7869d582c2540" Oct 03 15:01:57 crc kubenswrapper[4774]: I1003 15:01:57.987199 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgp55\" (UniqueName: 
\"kubernetes.io/projected/57e8d317-9bdc-4b6f-8705-92369e72c67a-kube-api-access-pgp55\") pod \"ceilometer-0\" (UID: \"57e8d317-9bdc-4b6f-8705-92369e72c67a\") " pod="openstack/ceilometer-0" Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.016569 4774 scope.go:117] "RemoveContainer" containerID="928acf2f57bdfddefacdbe8fb2150cfeeb1911e55dd2a7b5d1b38200ad166808" Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.064479 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5c49d658df-5r5zg" Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.077489 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.175275 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.179081 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d22e7a67-ce5f-4276-8c81-4ff98ad47524" containerName="glance-log" containerID="cri-o://6649f825244c5a3ab28f6b97034ed4582119b4d3d49dcfc4805e3f1cf51f54d5" gracePeriod=30 Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.179488 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d22e7a67-ce5f-4276-8c81-4ff98ad47524" containerName="glance-httpd" containerID="cri-o://7d261663daa84edecac8cf5c077af37a0c44d06245a72aace4d5d6876340f6c5" gracePeriod=30 Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.573020 4774 generic.go:334] "Generic (PLEG): container finished" podID="367d8aee-34b7-485e-848a-3e267afa8fd6" containerID="5329abe20e4af465ff237a25f80e8f2134a085e9f5caaede1ca3af4df38fd631" exitCode=0 Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.573214 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-db-create-tggq5" event={"ID":"367d8aee-34b7-485e-848a-3e267afa8fd6","Type":"ContainerDied","Data":"5329abe20e4af465ff237a25f80e8f2134a085e9f5caaede1ca3af4df38fd631"} Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.576608 4774 generic.go:334] "Generic (PLEG): container finished" podID="d22e7a67-ce5f-4276-8c81-4ff98ad47524" containerID="6649f825244c5a3ab28f6b97034ed4582119b4d3d49dcfc4805e3f1cf51f54d5" exitCode=143 Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.576664 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d22e7a67-ce5f-4276-8c81-4ff98ad47524","Type":"ContainerDied","Data":"6649f825244c5a3ab28f6b97034ed4582119b4d3d49dcfc4805e3f1cf51f54d5"} Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.578180 4774 generic.go:334] "Generic (PLEG): container finished" podID="6dde1c72-df79-4066-837c-0e318b636b73" containerID="9c25c44261e089e994c953912b63d0e3c8063795102245e228aa8fbeb35b0508" exitCode=0 Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.578221 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-8qxhd" event={"ID":"6dde1c72-df79-4066-837c-0e318b636b73","Type":"ContainerDied","Data":"9c25c44261e089e994c953912b63d0e3c8063795102245e228aa8fbeb35b0508"} Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.579545 4774 generic.go:334] "Generic (PLEG): container finished" podID="38b934ad-cf29-40d1-993c-8dcc6b8c0b8c" containerID="916528f6fadc363a6d990690802f342d8a661299deb5228e84ec48349173d3ba" exitCode=0 Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.579590 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-mgszf" event={"ID":"38b934ad-cf29-40d1-993c-8dcc6b8c0b8c","Type":"ContainerDied","Data":"916528f6fadc363a6d990690802f342d8a661299deb5228e84ec48349173d3ba"} Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.581145 4774 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.595279 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 15:01:58 crc kubenswrapper[4774]: W1003 15:01:58.636803 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda370be32_2d52_48b7_b529_53e1d92a89a9.slice/crio-8a573c24a0736852f2c0c8f69fbd1052b26c5c25ff52cc670698c19b5e3303a6 WatchSource:0}: Error finding container 8a573c24a0736852f2c0c8f69fbd1052b26c5c25ff52cc670698c19b5e3303a6: Status 404 returned error can't find the container with id 8a573c24a0736852f2c0c8f69fbd1052b26c5c25ff52cc670698c19b5e3303a6 Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.644638 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.664010 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgp55\" (UniqueName: \"kubernetes.io/projected/57e8d317-9bdc-4b6f-8705-92369e72c67a-kube-api-access-pgp55\") pod \"57e8d317-9bdc-4b6f-8705-92369e72c67a\" (UID: \"57e8d317-9bdc-4b6f-8705-92369e72c67a\") " Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.664296 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57e8d317-9bdc-4b6f-8705-92369e72c67a-combined-ca-bundle\") pod \"57e8d317-9bdc-4b6f-8705-92369e72c67a\" (UID: \"57e8d317-9bdc-4b6f-8705-92369e72c67a\") " Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.664325 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57e8d317-9bdc-4b6f-8705-92369e72c67a-run-httpd\") pod \"57e8d317-9bdc-4b6f-8705-92369e72c67a\" (UID: \"57e8d317-9bdc-4b6f-8705-92369e72c67a\") " Oct 
03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.664346 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57e8d317-9bdc-4b6f-8705-92369e72c67a-log-httpd\") pod \"57e8d317-9bdc-4b6f-8705-92369e72c67a\" (UID: \"57e8d317-9bdc-4b6f-8705-92369e72c67a\") " Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.664387 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57e8d317-9bdc-4b6f-8705-92369e72c67a-config-data\") pod \"57e8d317-9bdc-4b6f-8705-92369e72c67a\" (UID: \"57e8d317-9bdc-4b6f-8705-92369e72c67a\") " Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.664513 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57e8d317-9bdc-4b6f-8705-92369e72c67a-scripts\") pod \"57e8d317-9bdc-4b6f-8705-92369e72c67a\" (UID: \"57e8d317-9bdc-4b6f-8705-92369e72c67a\") " Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.664574 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/57e8d317-9bdc-4b6f-8705-92369e72c67a-sg-core-conf-yaml\") pod \"57e8d317-9bdc-4b6f-8705-92369e72c67a\" (UID: \"57e8d317-9bdc-4b6f-8705-92369e72c67a\") " Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.665993 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57e8d317-9bdc-4b6f-8705-92369e72c67a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "57e8d317-9bdc-4b6f-8705-92369e72c67a" (UID: "57e8d317-9bdc-4b6f-8705-92369e72c67a"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.666045 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57e8d317-9bdc-4b6f-8705-92369e72c67a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "57e8d317-9bdc-4b6f-8705-92369e72c67a" (UID: "57e8d317-9bdc-4b6f-8705-92369e72c67a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.671995 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57e8d317-9bdc-4b6f-8705-92369e72c67a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "57e8d317-9bdc-4b6f-8705-92369e72c67a" (UID: "57e8d317-9bdc-4b6f-8705-92369e72c67a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.672116 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57e8d317-9bdc-4b6f-8705-92369e72c67a-scripts" (OuterVolumeSpecName: "scripts") pod "57e8d317-9bdc-4b6f-8705-92369e72c67a" (UID: "57e8d317-9bdc-4b6f-8705-92369e72c67a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.672193 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57e8d317-9bdc-4b6f-8705-92369e72c67a-kube-api-access-pgp55" (OuterVolumeSpecName: "kube-api-access-pgp55") pod "57e8d317-9bdc-4b6f-8705-92369e72c67a" (UID: "57e8d317-9bdc-4b6f-8705-92369e72c67a"). InnerVolumeSpecName "kube-api-access-pgp55". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.673659 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57e8d317-9bdc-4b6f-8705-92369e72c67a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57e8d317-9bdc-4b6f-8705-92369e72c67a" (UID: "57e8d317-9bdc-4b6f-8705-92369e72c67a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.678698 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57e8d317-9bdc-4b6f-8705-92369e72c67a-config-data" (OuterVolumeSpecName: "config-data") pod "57e8d317-9bdc-4b6f-8705-92369e72c67a" (UID: "57e8d317-9bdc-4b6f-8705-92369e72c67a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.765995 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57e8d317-9bdc-4b6f-8705-92369e72c67a-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.766033 4774 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/57e8d317-9bdc-4b6f-8705-92369e72c67a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.766045 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgp55\" (UniqueName: \"kubernetes.io/projected/57e8d317-9bdc-4b6f-8705-92369e72c67a-kube-api-access-pgp55\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.766056 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57e8d317-9bdc-4b6f-8705-92369e72c67a-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.766064 4774 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57e8d317-9bdc-4b6f-8705-92369e72c67a-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.766073 4774 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57e8d317-9bdc-4b6f-8705-92369e72c67a-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:58 crc kubenswrapper[4774]: I1003 15:01:58.766084 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57e8d317-9bdc-4b6f-8705-92369e72c67a-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.165637 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.302887 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-logs\") pod \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\" (UID: \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\") " Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.302966 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-httpd-run\") pod \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\" (UID: \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\") " Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.303017 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwnl4\" (UniqueName: \"kubernetes.io/projected/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-kube-api-access-dwnl4\") pod \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\" (UID: 
\"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\") " Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.303141 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\" (UID: \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\") " Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.303195 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-combined-ca-bundle\") pod \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\" (UID: \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\") " Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.303215 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-config-data\") pod \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\" (UID: \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\") " Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.303278 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-scripts\") pod \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\" (UID: \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\") " Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.303304 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-public-tls-certs\") pod \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\" (UID: \"5a4f000d-fdd6-46ec-b6b4-55a574ca801d\") " Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.304705 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-logs" (OuterVolumeSpecName: "logs") pod 
"5a4f000d-fdd6-46ec-b6b4-55a574ca801d" (UID: "5a4f000d-fdd6-46ec-b6b4-55a574ca801d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.304851 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5a4f000d-fdd6-46ec-b6b4-55a574ca801d" (UID: "5a4f000d-fdd6-46ec-b6b4-55a574ca801d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.310976 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-kube-api-access-dwnl4" (OuterVolumeSpecName: "kube-api-access-dwnl4") pod "5a4f000d-fdd6-46ec-b6b4-55a574ca801d" (UID: "5a4f000d-fdd6-46ec-b6b4-55a574ca801d"). InnerVolumeSpecName "kube-api-access-dwnl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.313344 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "5a4f000d-fdd6-46ec-b6b4-55a574ca801d" (UID: "5a4f000d-fdd6-46ec-b6b4-55a574ca801d"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.313657 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3308cda8-c038-4fbc-91ad-824ce2c1d85c" path="/var/lib/kubelet/pods/3308cda8-c038-4fbc-91ad-824ce2c1d85c/volumes" Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.314502 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c51760f-5a26-453e-b578-3bc16d784a4a" path="/var/lib/kubelet/pods/5c51760f-5a26-453e-b578-3bc16d784a4a/volumes" Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.315909 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="871f7d16-54b6-4aa9-8e99-00a888d41f70" path="/var/lib/kubelet/pods/871f7d16-54b6-4aa9-8e99-00a888d41f70/volumes" Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.316601 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ab201e1-9ef3-485b-81f2-0b421dcc66cc" path="/var/lib/kubelet/pods/9ab201e1-9ef3-485b-81f2-0b421dcc66cc/volumes" Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.321800 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-scripts" (OuterVolumeSpecName: "scripts") pod "5a4f000d-fdd6-46ec-b6b4-55a574ca801d" (UID: "5a4f000d-fdd6-46ec-b6b4-55a574ca801d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.371333 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a4f000d-fdd6-46ec-b6b4-55a574ca801d" (UID: "5a4f000d-fdd6-46ec-b6b4-55a574ca801d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.405843 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwnl4\" (UniqueName: \"kubernetes.io/projected/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-kube-api-access-dwnl4\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.405897 4774 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.405915 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.405928 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.405940 4774 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-logs\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.405952 4774 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.424940 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5a4f000d-fdd6-46ec-b6b4-55a574ca801d" (UID: "5a4f000d-fdd6-46ec-b6b4-55a574ca801d"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.425566 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-config-data" (OuterVolumeSpecName: "config-data") pod "5a4f000d-fdd6-46ec-b6b4-55a574ca801d" (UID: "5a4f000d-fdd6-46ec-b6b4-55a574ca801d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.435071 4774 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc"
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.507660 4774 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\""
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.508004 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.508022 4774 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a4f000d-fdd6-46ec-b6b4-55a574ca801d-public-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.619178 4774 generic.go:334] "Generic (PLEG): container finished" podID="5a4f000d-fdd6-46ec-b6b4-55a574ca801d" containerID="6dadf5c7a6046e152b21be98bc4a11f540148be2f9f8dbd1628a8d83d1d43ea2" exitCode=0
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.619276 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5a4f000d-fdd6-46ec-b6b4-55a574ca801d","Type":"ContainerDied","Data":"6dadf5c7a6046e152b21be98bc4a11f540148be2f9f8dbd1628a8d83d1d43ea2"}
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.619319 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5a4f000d-fdd6-46ec-b6b4-55a574ca801d","Type":"ContainerDied","Data":"5c026e66286f4cab647ff08fc98b578a9428bfab2688d1b9902f51799f920575"}
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.619336 4774 scope.go:117] "RemoveContainer" containerID="6dadf5c7a6046e152b21be98bc4a11f540148be2f9f8dbd1628a8d83d1d43ea2"
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.619486 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.643712 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a370be32-2d52-48b7-b529-53e1d92a89a9","Type":"ContainerStarted","Data":"fdaef1d746d9eaf39c4722d59ce635ad8c5fc039572f8116983ef9cf0381d1ea"}
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.643754 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a370be32-2d52-48b7-b529-53e1d92a89a9","Type":"ContainerStarted","Data":"8a573c24a0736852f2c0c8f69fbd1052b26c5c25ff52cc670698c19b5e3303a6"}
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.644110 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.706853 4774 scope.go:117] "RemoveContainer" containerID="b463abc99cf516a57bf6debf4dde0935b961cbb0865fa8230aa54cdca3c79d65"
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.739356 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.742275 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.763859 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.782871 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 03 15:01:59 crc kubenswrapper[4774]: E1003 15:01:59.783335 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a4f000d-fdd6-46ec-b6b4-55a574ca801d" containerName="glance-log"
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.783351 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a4f000d-fdd6-46ec-b6b4-55a574ca801d" containerName="glance-log"
Oct 03 15:01:59 crc kubenswrapper[4774]: E1003 15:01:59.783383 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a4f000d-fdd6-46ec-b6b4-55a574ca801d" containerName="glance-httpd"
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.783389 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a4f000d-fdd6-46ec-b6b4-55a574ca801d" containerName="glance-httpd"
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.783605 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a4f000d-fdd6-46ec-b6b4-55a574ca801d" containerName="glance-log"
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.783615 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a4f000d-fdd6-46ec-b6b4-55a574ca801d" containerName="glance-httpd"
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.785290 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.787049 4774 scope.go:117] "RemoveContainer" containerID="6dadf5c7a6046e152b21be98bc4a11f540148be2f9f8dbd1628a8d83d1d43ea2"
Oct 03 15:01:59 crc kubenswrapper[4774]: E1003 15:01:59.787686 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dadf5c7a6046e152b21be98bc4a11f540148be2f9f8dbd1628a8d83d1d43ea2\": container with ID starting with 6dadf5c7a6046e152b21be98bc4a11f540148be2f9f8dbd1628a8d83d1d43ea2 not found: ID does not exist" containerID="6dadf5c7a6046e152b21be98bc4a11f540148be2f9f8dbd1628a8d83d1d43ea2"
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.787714 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dadf5c7a6046e152b21be98bc4a11f540148be2f9f8dbd1628a8d83d1d43ea2"} err="failed to get container status \"6dadf5c7a6046e152b21be98bc4a11f540148be2f9f8dbd1628a8d83d1d43ea2\": rpc error: code = NotFound desc = could not find container \"6dadf5c7a6046e152b21be98bc4a11f540148be2f9f8dbd1628a8d83d1d43ea2\": container with ID starting with 6dadf5c7a6046e152b21be98bc4a11f540148be2f9f8dbd1628a8d83d1d43ea2 not found: ID does not exist"
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.787732 4774 scope.go:117] "RemoveContainer" containerID="b463abc99cf516a57bf6debf4dde0935b961cbb0865fa8230aa54cdca3c79d65"
Oct 03 15:01:59 crc kubenswrapper[4774]: E1003 15:01:59.787966 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b463abc99cf516a57bf6debf4dde0935b961cbb0865fa8230aa54cdca3c79d65\": container with ID starting with b463abc99cf516a57bf6debf4dde0935b961cbb0865fa8230aa54cdca3c79d65 not found: ID does not exist" containerID="b463abc99cf516a57bf6debf4dde0935b961cbb0865fa8230aa54cdca3c79d65"
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.787982 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b463abc99cf516a57bf6debf4dde0935b961cbb0865fa8230aa54cdca3c79d65"} err="failed to get container status \"b463abc99cf516a57bf6debf4dde0935b961cbb0865fa8230aa54cdca3c79d65\": rpc error: code = NotFound desc = could not find container \"b463abc99cf516a57bf6debf4dde0935b961cbb0865fa8230aa54cdca3c79d65\": container with ID starting with b463abc99cf516a57bf6debf4dde0935b961cbb0865fa8230aa54cdca3c79d65 not found: ID does not exist"
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.788196 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.789712 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.806440 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.828791 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.839440 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.840820 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.842510 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.842679 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.845530 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.918066 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c18749aa-da1a-43fc-9275-5fee0919be21-config-data\") pod \"ceilometer-0\" (UID: \"c18749aa-da1a-43fc-9275-5fee0919be21\") " pod="openstack/ceilometer-0"
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.918125 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47nq8\" (UniqueName: \"kubernetes.io/projected/c18749aa-da1a-43fc-9275-5fee0919be21-kube-api-access-47nq8\") pod \"ceilometer-0\" (UID: \"c18749aa-da1a-43fc-9275-5fee0919be21\") " pod="openstack/ceilometer-0"
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.918193 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/576a053d-3110-4bf1-a079-512e6bc51cbe-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"576a053d-3110-4bf1-a079-512e6bc51cbe\") " pod="openstack/glance-default-external-api-0"
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.918245 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/576a053d-3110-4bf1-a079-512e6bc51cbe-scripts\") pod \"glance-default-external-api-0\" (UID: \"576a053d-3110-4bf1-a079-512e6bc51cbe\") " pod="openstack/glance-default-external-api-0"
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.918271 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c18749aa-da1a-43fc-9275-5fee0919be21-scripts\") pod \"ceilometer-0\" (UID: \"c18749aa-da1a-43fc-9275-5fee0919be21\") " pod="openstack/ceilometer-0"
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.918306 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/576a053d-3110-4bf1-a079-512e6bc51cbe-config-data\") pod \"glance-default-external-api-0\" (UID: \"576a053d-3110-4bf1-a079-512e6bc51cbe\") " pod="openstack/glance-default-external-api-0"
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.918333 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/576a053d-3110-4bf1-a079-512e6bc51cbe-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"576a053d-3110-4bf1-a079-512e6bc51cbe\") " pod="openstack/glance-default-external-api-0"
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.918347 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/576a053d-3110-4bf1-a079-512e6bc51cbe-logs\") pod \"glance-default-external-api-0\" (UID: \"576a053d-3110-4bf1-a079-512e6bc51cbe\") " pod="openstack/glance-default-external-api-0"
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.918396 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44hqt\" (UniqueName: \"kubernetes.io/projected/576a053d-3110-4bf1-a079-512e6bc51cbe-kube-api-access-44hqt\") pod \"glance-default-external-api-0\" (UID: \"576a053d-3110-4bf1-a079-512e6bc51cbe\") " pod="openstack/glance-default-external-api-0"
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.918414 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c18749aa-da1a-43fc-9275-5fee0919be21-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c18749aa-da1a-43fc-9275-5fee0919be21\") " pod="openstack/ceilometer-0"
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.918428 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"576a053d-3110-4bf1-a079-512e6bc51cbe\") " pod="openstack/glance-default-external-api-0"
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.918465 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/576a053d-3110-4bf1-a079-512e6bc51cbe-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"576a053d-3110-4bf1-a079-512e6bc51cbe\") " pod="openstack/glance-default-external-api-0"
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.918485 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c18749aa-da1a-43fc-9275-5fee0919be21-run-httpd\") pod \"ceilometer-0\" (UID: \"c18749aa-da1a-43fc-9275-5fee0919be21\") " pod="openstack/ceilometer-0"
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.918512 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c18749aa-da1a-43fc-9275-5fee0919be21-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c18749aa-da1a-43fc-9275-5fee0919be21\") " pod="openstack/ceilometer-0"
Oct 03 15:01:59 crc kubenswrapper[4774]: I1003 15:01:59.918548 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c18749aa-da1a-43fc-9275-5fee0919be21-log-httpd\") pod \"ceilometer-0\" (UID: \"c18749aa-da1a-43fc-9275-5fee0919be21\") " pod="openstack/ceilometer-0"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.019640 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/576a053d-3110-4bf1-a079-512e6bc51cbe-scripts\") pod \"glance-default-external-api-0\" (UID: \"576a053d-3110-4bf1-a079-512e6bc51cbe\") " pod="openstack/glance-default-external-api-0"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.019692 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c18749aa-da1a-43fc-9275-5fee0919be21-scripts\") pod \"ceilometer-0\" (UID: \"c18749aa-da1a-43fc-9275-5fee0919be21\") " pod="openstack/ceilometer-0"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.019719 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/576a053d-3110-4bf1-a079-512e6bc51cbe-config-data\") pod \"glance-default-external-api-0\" (UID: \"576a053d-3110-4bf1-a079-512e6bc51cbe\") " pod="openstack/glance-default-external-api-0"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.019744 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/576a053d-3110-4bf1-a079-512e6bc51cbe-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"576a053d-3110-4bf1-a079-512e6bc51cbe\") " pod="openstack/glance-default-external-api-0"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.019760 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/576a053d-3110-4bf1-a079-512e6bc51cbe-logs\") pod \"glance-default-external-api-0\" (UID: \"576a053d-3110-4bf1-a079-512e6bc51cbe\") " pod="openstack/glance-default-external-api-0"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.019784 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44hqt\" (UniqueName: \"kubernetes.io/projected/576a053d-3110-4bf1-a079-512e6bc51cbe-kube-api-access-44hqt\") pod \"glance-default-external-api-0\" (UID: \"576a053d-3110-4bf1-a079-512e6bc51cbe\") " pod="openstack/glance-default-external-api-0"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.019826 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"576a053d-3110-4bf1-a079-512e6bc51cbe\") " pod="openstack/glance-default-external-api-0"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.019841 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c18749aa-da1a-43fc-9275-5fee0919be21-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c18749aa-da1a-43fc-9275-5fee0919be21\") " pod="openstack/ceilometer-0"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.019861 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/576a053d-3110-4bf1-a079-512e6bc51cbe-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"576a053d-3110-4bf1-a079-512e6bc51cbe\") " pod="openstack/glance-default-external-api-0"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.019880 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c18749aa-da1a-43fc-9275-5fee0919be21-run-httpd\") pod \"ceilometer-0\" (UID: \"c18749aa-da1a-43fc-9275-5fee0919be21\") " pod="openstack/ceilometer-0"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.019910 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c18749aa-da1a-43fc-9275-5fee0919be21-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c18749aa-da1a-43fc-9275-5fee0919be21\") " pod="openstack/ceilometer-0"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.019930 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c18749aa-da1a-43fc-9275-5fee0919be21-log-httpd\") pod \"ceilometer-0\" (UID: \"c18749aa-da1a-43fc-9275-5fee0919be21\") " pod="openstack/ceilometer-0"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.019956 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c18749aa-da1a-43fc-9275-5fee0919be21-config-data\") pod \"ceilometer-0\" (UID: \"c18749aa-da1a-43fc-9275-5fee0919be21\") " pod="openstack/ceilometer-0"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.019982 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47nq8\" (UniqueName: \"kubernetes.io/projected/c18749aa-da1a-43fc-9275-5fee0919be21-kube-api-access-47nq8\") pod \"ceilometer-0\" (UID: \"c18749aa-da1a-43fc-9275-5fee0919be21\") " pod="openstack/ceilometer-0"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.020020 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/576a053d-3110-4bf1-a079-512e6bc51cbe-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"576a053d-3110-4bf1-a079-512e6bc51cbe\") " pod="openstack/glance-default-external-api-0"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.021096 4774 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"576a053d-3110-4bf1-a079-512e6bc51cbe\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.024769 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/576a053d-3110-4bf1-a079-512e6bc51cbe-logs\") pod \"glance-default-external-api-0\" (UID: \"576a053d-3110-4bf1-a079-512e6bc51cbe\") " pod="openstack/glance-default-external-api-0"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.025394 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/576a053d-3110-4bf1-a079-512e6bc51cbe-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"576a053d-3110-4bf1-a079-512e6bc51cbe\") " pod="openstack/glance-default-external-api-0"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.026789 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c18749aa-da1a-43fc-9275-5fee0919be21-run-httpd\") pod \"ceilometer-0\" (UID: \"c18749aa-da1a-43fc-9275-5fee0919be21\") " pod="openstack/ceilometer-0"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.027340 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c18749aa-da1a-43fc-9275-5fee0919be21-log-httpd\") pod \"ceilometer-0\" (UID: \"c18749aa-da1a-43fc-9275-5fee0919be21\") " pod="openstack/ceilometer-0"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.030131 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/576a053d-3110-4bf1-a079-512e6bc51cbe-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"576a053d-3110-4bf1-a079-512e6bc51cbe\") " pod="openstack/glance-default-external-api-0"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.033254 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/576a053d-3110-4bf1-a079-512e6bc51cbe-config-data\") pod \"glance-default-external-api-0\" (UID: \"576a053d-3110-4bf1-a079-512e6bc51cbe\") " pod="openstack/glance-default-external-api-0"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.034716 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c18749aa-da1a-43fc-9275-5fee0919be21-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c18749aa-da1a-43fc-9275-5fee0919be21\") " pod="openstack/ceilometer-0"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.038904 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c18749aa-da1a-43fc-9275-5fee0919be21-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c18749aa-da1a-43fc-9275-5fee0919be21\") " pod="openstack/ceilometer-0"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.039664 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c18749aa-da1a-43fc-9275-5fee0919be21-config-data\") pod \"ceilometer-0\" (UID: \"c18749aa-da1a-43fc-9275-5fee0919be21\") " pod="openstack/ceilometer-0"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.039957 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/576a053d-3110-4bf1-a079-512e6bc51cbe-scripts\") pod \"glance-default-external-api-0\" (UID: \"576a053d-3110-4bf1-a079-512e6bc51cbe\") " pod="openstack/glance-default-external-api-0"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.040149 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/576a053d-3110-4bf1-a079-512e6bc51cbe-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"576a053d-3110-4bf1-a079-512e6bc51cbe\") " pod="openstack/glance-default-external-api-0"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.053022 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c18749aa-da1a-43fc-9275-5fee0919be21-scripts\") pod \"ceilometer-0\" (UID: \"c18749aa-da1a-43fc-9275-5fee0919be21\") " pod="openstack/ceilometer-0"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.064096 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44hqt\" (UniqueName: \"kubernetes.io/projected/576a053d-3110-4bf1-a079-512e6bc51cbe-kube-api-access-44hqt\") pod \"glance-default-external-api-0\" (UID: \"576a053d-3110-4bf1-a079-512e6bc51cbe\") " pod="openstack/glance-default-external-api-0"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.065518 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47nq8\" (UniqueName: \"kubernetes.io/projected/c18749aa-da1a-43fc-9275-5fee0919be21-kube-api-access-47nq8\") pod \"ceilometer-0\" (UID: \"c18749aa-da1a-43fc-9275-5fee0919be21\") " pod="openstack/ceilometer-0"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.079853 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"576a053d-3110-4bf1-a079-512e6bc51cbe\") " pod="openstack/glance-default-external-api-0"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.137075 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.184337 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.478565 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-tggq5"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.500049 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-mgszf"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.513709 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-8qxhd"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.539023 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76f46\" (UniqueName: \"kubernetes.io/projected/367d8aee-34b7-485e-848a-3e267afa8fd6-kube-api-access-76f46\") pod \"367d8aee-34b7-485e-848a-3e267afa8fd6\" (UID: \"367d8aee-34b7-485e-848a-3e267afa8fd6\") "
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.539438 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpxj5\" (UniqueName: \"kubernetes.io/projected/38b934ad-cf29-40d1-993c-8dcc6b8c0b8c-kube-api-access-dpxj5\") pod \"38b934ad-cf29-40d1-993c-8dcc6b8c0b8c\" (UID: \"38b934ad-cf29-40d1-993c-8dcc6b8c0b8c\") "
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.547383 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/367d8aee-34b7-485e-848a-3e267afa8fd6-kube-api-access-76f46" (OuterVolumeSpecName: "kube-api-access-76f46") pod "367d8aee-34b7-485e-848a-3e267afa8fd6" (UID: "367d8aee-34b7-485e-848a-3e267afa8fd6"). InnerVolumeSpecName "kube-api-access-76f46". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.546823 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38b934ad-cf29-40d1-993c-8dcc6b8c0b8c-kube-api-access-dpxj5" (OuterVolumeSpecName: "kube-api-access-dpxj5") pod "38b934ad-cf29-40d1-993c-8dcc6b8c0b8c" (UID: "38b934ad-cf29-40d1-993c-8dcc6b8c0b8c"). InnerVolumeSpecName "kube-api-access-dpxj5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.645064 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xltn2\" (UniqueName: \"kubernetes.io/projected/6dde1c72-df79-4066-837c-0e318b636b73-kube-api-access-xltn2\") pod \"6dde1c72-df79-4066-837c-0e318b636b73\" (UID: \"6dde1c72-df79-4066-837c-0e318b636b73\") "
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.646569 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76f46\" (UniqueName: \"kubernetes.io/projected/367d8aee-34b7-485e-848a-3e267afa8fd6-kube-api-access-76f46\") on node \"crc\" DevicePath \"\""
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.646586 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpxj5\" (UniqueName: \"kubernetes.io/projected/38b934ad-cf29-40d1-993c-8dcc6b8c0b8c-kube-api-access-dpxj5\") on node \"crc\" DevicePath \"\""
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.649560 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dde1c72-df79-4066-837c-0e318b636b73-kube-api-access-xltn2" (OuterVolumeSpecName: "kube-api-access-xltn2") pod "6dde1c72-df79-4066-837c-0e318b636b73" (UID: "6dde1c72-df79-4066-837c-0e318b636b73"). InnerVolumeSpecName "kube-api-access-xltn2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.659413 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-8qxhd" event={"ID":"6dde1c72-df79-4066-837c-0e318b636b73","Type":"ContainerDied","Data":"47e6c3322f949eedd41a236bd565f18f65739ba57f0071d7afecb01041fef118"}
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.659550 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47e6c3322f949eedd41a236bd565f18f65739ba57f0071d7afecb01041fef118"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.659664 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-8qxhd"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.664224 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-mgszf" event={"ID":"38b934ad-cf29-40d1-993c-8dcc6b8c0b8c","Type":"ContainerDied","Data":"7fb9c015f9277ee4891e9109fdcbe04384bc581afe2467c7260fb0597284b9ac"}
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.664251 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fb9c015f9277ee4891e9109fdcbe04384bc581afe2467c7260fb0597284b9ac"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.664292 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-mgszf"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.670557 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a370be32-2d52-48b7-b529-53e1d92a89a9","Type":"ContainerStarted","Data":"8b3c69940681c97546b64ba65e9e2b79b873246b4abe942fbf802f6c721d6200"}
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.671549 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-tggq5" event={"ID":"367d8aee-34b7-485e-848a-3e267afa8fd6","Type":"ContainerDied","Data":"a7e40f08472c61b9efdd17d89fb2d5bd5d8c49dcc8504bb48cbb040463059410"}
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.671573 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7e40f08472c61b9efdd17d89fb2d5bd5d8c49dcc8504bb48cbb040463059410"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.671625 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-tggq5"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.694620 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.694601181 podStartE2EDuration="3.694601181s" podCreationTimestamp="2025-10-03 15:01:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:02:00.687191447 +0000 UTC m=+1143.276394919" watchObservedRunningTime="2025-10-03 15:02:00.694601181 +0000 UTC m=+1143.283804623"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.739339 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8b5c85b87-8gc5c" podUID="3308cda8-c038-4fbc-91ad-824ce2c1d85c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.138:5353: i/o timeout"
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.749230 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xltn2\" (UniqueName: \"kubernetes.io/projected/6dde1c72-df79-4066-837c-0e318b636b73-kube-api-access-xltn2\") on node \"crc\" DevicePath \"\""
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.778248 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 03 15:02:00 crc kubenswrapper[4774]: W1003 15:02:00.861110 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod576a053d_3110_4bf1_a079_512e6bc51cbe.slice/crio-17af7a8790a505c27ca87ae25c730b65d311eaa7bd22f5d8fdf2e5f603b7e15e WatchSource:0}: Error finding container 17af7a8790a505c27ca87ae25c730b65d311eaa7bd22f5d8fdf2e5f603b7e15e: Status 404 returned error can't find the container with id 17af7a8790a505c27ca87ae25c730b65d311eaa7bd22f5d8fdf2e5f603b7e15e
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.862309 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 03 15:02:00 crc kubenswrapper[4774]: I1003 15:02:00.938487 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 03 15:02:01 crc kubenswrapper[4774]: I1003 15:02:01.315032 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57e8d317-9bdc-4b6f-8705-92369e72c67a" path="/var/lib/kubelet/pods/57e8d317-9bdc-4b6f-8705-92369e72c67a/volumes"
Oct 03 15:02:01 crc kubenswrapper[4774]: I1003 15:02:01.315926 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a4f000d-fdd6-46ec-b6b4-55a574ca801d" path="/var/lib/kubelet/pods/5a4f000d-fdd6-46ec-b6b4-55a574ca801d/volumes"
Oct 03 15:02:01 crc kubenswrapper[4774]: I1003 15:02:01.733821 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"576a053d-3110-4bf1-a079-512e6bc51cbe","Type":"ContainerStarted","Data":"7e8db326eb838df260d5f2b39483f49444e8886ae07646d0bd72182d09eaa590"}
Oct 03 15:02:01 crc kubenswrapper[4774]: I1003 15:02:01.734179 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"576a053d-3110-4bf1-a079-512e6bc51cbe","Type":"ContainerStarted","Data":"17af7a8790a505c27ca87ae25c730b65d311eaa7bd22f5d8fdf2e5f603b7e15e"}
Oct 03 15:02:01 crc kubenswrapper[4774]: I1003 15:02:01.734994 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c18749aa-da1a-43fc-9275-5fee0919be21","Type":"ContainerStarted","Data":"b59964bf9a5519a266f5267b6662dd42461eb7ddd85043e6846b2ba045d54caf"}
Oct 03 15:02:01 crc kubenswrapper[4774]: I1003 15:02:01.742780 4774 generic.go:334] "Generic (PLEG): container finished" podID="d22e7a67-ce5f-4276-8c81-4ff98ad47524" containerID="7d261663daa84edecac8cf5c077af37a0c44d06245a72aace4d5d6876340f6c5" exitCode=0
Oct 03 15:02:01 crc kubenswrapper[4774]: I1003 15:02:01.743712 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d22e7a67-ce5f-4276-8c81-4ff98ad47524","Type":"ContainerDied","Data":"7d261663daa84edecac8cf5c077af37a0c44d06245a72aace4d5d6876340f6c5"}
Oct 03 15:02:01 crc kubenswrapper[4774]: I1003 15:02:01.993021 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.071943 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d22e7a67-ce5f-4276-8c81-4ff98ad47524-logs\") pod \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\" (UID: \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\") "
Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.072078 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22e7a67-ce5f-4276-8c81-4ff98ad47524-config-data\") pod \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\" (UID: \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\") "
Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.072142 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74fr5\" (UniqueName: \"kubernetes.io/projected/d22e7a67-ce5f-4276-8c81-4ff98ad47524-kube-api-access-74fr5\") pod \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\" (UID: \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\") "
Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.072196 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\" (UID: \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\") "
Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.072269 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d22e7a67-ce5f-4276-8c81-4ff98ad47524-scripts\") pod \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\" (UID: \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\") "
Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.072299 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d22e7a67-ce5f-4276-8c81-4ff98ad47524-combined-ca-bundle\") pod \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\" (UID: \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\") "
Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.072394 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d22e7a67-ce5f-4276-8c81-4ff98ad47524-internal-tls-certs\") pod \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\" (UID: \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\") "
Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.072445 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d22e7a67-ce5f-4276-8c81-4ff98ad47524-httpd-run\") pod \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\" (UID: \"d22e7a67-ce5f-4276-8c81-4ff98ad47524\") "
Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.073195 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d22e7a67-ce5f-4276-8c81-4ff98ad47524-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d22e7a67-ce5f-4276-8c81-4ff98ad47524" (UID: "d22e7a67-ce5f-4276-8c81-4ff98ad47524"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.073461 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d22e7a67-ce5f-4276-8c81-4ff98ad47524-logs" (OuterVolumeSpecName: "logs") pod "d22e7a67-ce5f-4276-8c81-4ff98ad47524" (UID: "d22e7a67-ce5f-4276-8c81-4ff98ad47524").
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.080617 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "d22e7a67-ce5f-4276-8c81-4ff98ad47524" (UID: "d22e7a67-ce5f-4276-8c81-4ff98ad47524"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.091236 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d22e7a67-ce5f-4276-8c81-4ff98ad47524-kube-api-access-74fr5" (OuterVolumeSpecName: "kube-api-access-74fr5") pod "d22e7a67-ce5f-4276-8c81-4ff98ad47524" (UID: "d22e7a67-ce5f-4276-8c81-4ff98ad47524"). InnerVolumeSpecName "kube-api-access-74fr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.101556 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d22e7a67-ce5f-4276-8c81-4ff98ad47524-scripts" (OuterVolumeSpecName: "scripts") pod "d22e7a67-ce5f-4276-8c81-4ff98ad47524" (UID: "d22e7a67-ce5f-4276-8c81-4ff98ad47524"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.149140 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d22e7a67-ce5f-4276-8c81-4ff98ad47524-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d22e7a67-ce5f-4276-8c81-4ff98ad47524" (UID: "d22e7a67-ce5f-4276-8c81-4ff98ad47524"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.173837 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d22e7a67-ce5f-4276-8c81-4ff98ad47524-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d22e7a67-ce5f-4276-8c81-4ff98ad47524" (UID: "d22e7a67-ce5f-4276-8c81-4ff98ad47524"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.174224 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d22e7a67-ce5f-4276-8c81-4ff98ad47524-config-data" (OuterVolumeSpecName: "config-data") pod "d22e7a67-ce5f-4276-8c81-4ff98ad47524" (UID: "d22e7a67-ce5f-4276-8c81-4ff98ad47524"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.174980 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d22e7a67-ce5f-4276-8c81-4ff98ad47524-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.175076 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d22e7a67-ce5f-4276-8c81-4ff98ad47524-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.175137 4774 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d22e7a67-ce5f-4276-8c81-4ff98ad47524-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.175191 4774 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d22e7a67-ce5f-4276-8c81-4ff98ad47524-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:02 
crc kubenswrapper[4774]: I1003 15:02:02.175244 4774 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d22e7a67-ce5f-4276-8c81-4ff98ad47524-logs\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.175295 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22e7a67-ce5f-4276-8c81-4ff98ad47524-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.175354 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74fr5\" (UniqueName: \"kubernetes.io/projected/d22e7a67-ce5f-4276-8c81-4ff98ad47524-kube-api-access-74fr5\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.175436 4774 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.203513 4774 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.276910 4774 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.755034 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d22e7a67-ce5f-4276-8c81-4ff98ad47524","Type":"ContainerDied","Data":"4e22c7a548eb1c3c4a50856562b595d69f813a1f922151e747a321dd48b4ada3"} Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.755271 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.755340 4774 scope.go:117] "RemoveContainer" containerID="7d261663daa84edecac8cf5c077af37a0c44d06245a72aace4d5d6876340f6c5" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.758610 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c18749aa-da1a-43fc-9275-5fee0919be21","Type":"ContainerStarted","Data":"d01b66f9a28c0a083d17709cb4b1d3cb3c00444ea223bbef77817fcd4f26c71a"} Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.758656 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c18749aa-da1a-43fc-9275-5fee0919be21","Type":"ContainerStarted","Data":"8be38e9e06abbf97ebb3732df0e30b8546e9e8a88ff6297fe72171cba2fda635"} Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.764493 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"576a053d-3110-4bf1-a079-512e6bc51cbe","Type":"ContainerStarted","Data":"74415d959f6ec6ba269cf9e605b50d5ec7554bc0cb855dbc1c457adf0bda04af"} Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.793260 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.793492 4774 scope.go:117] "RemoveContainer" containerID="6649f825244c5a3ab28f6b97034ed4582119b4d3d49dcfc4805e3f1cf51f54d5" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.805497 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.848419 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 15:02:02 crc kubenswrapper[4774]: E1003 15:02:02.849495 4774 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="38b934ad-cf29-40d1-993c-8dcc6b8c0b8c" containerName="mariadb-database-create" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.849515 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="38b934ad-cf29-40d1-993c-8dcc6b8c0b8c" containerName="mariadb-database-create" Oct 03 15:02:02 crc kubenswrapper[4774]: E1003 15:02:02.849543 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="367d8aee-34b7-485e-848a-3e267afa8fd6" containerName="mariadb-database-create" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.849550 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="367d8aee-34b7-485e-848a-3e267afa8fd6" containerName="mariadb-database-create" Oct 03 15:02:02 crc kubenswrapper[4774]: E1003 15:02:02.849568 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d22e7a67-ce5f-4276-8c81-4ff98ad47524" containerName="glance-httpd" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.849574 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="d22e7a67-ce5f-4276-8c81-4ff98ad47524" containerName="glance-httpd" Oct 03 15:02:02 crc kubenswrapper[4774]: E1003 15:02:02.849597 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d22e7a67-ce5f-4276-8c81-4ff98ad47524" containerName="glance-log" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.849603 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="d22e7a67-ce5f-4276-8c81-4ff98ad47524" containerName="glance-log" Oct 03 15:02:02 crc kubenswrapper[4774]: E1003 15:02:02.849614 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dde1c72-df79-4066-837c-0e318b636b73" containerName="mariadb-database-create" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.849621 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dde1c72-df79-4066-837c-0e318b636b73" containerName="mariadb-database-create" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.849967 4774 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="d22e7a67-ce5f-4276-8c81-4ff98ad47524" containerName="glance-httpd" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.849993 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="367d8aee-34b7-485e-848a-3e267afa8fd6" containerName="mariadb-database-create" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.850007 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="38b934ad-cf29-40d1-993c-8dcc6b8c0b8c" containerName="mariadb-database-create" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.850015 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="d22e7a67-ce5f-4276-8c81-4ff98ad47524" containerName="glance-log" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.850040 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dde1c72-df79-4066-837c-0e318b636b73" containerName="mariadb-database-create" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.852475 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.856241 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.857085 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.885355 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.88531921 podStartE2EDuration="3.88531921s" podCreationTimestamp="2025-10-03 15:01:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:02:02.84618549 +0000 UTC m=+1145.435388942" watchObservedRunningTime="2025-10-03 15:02:02.88531921 +0000 UTC m=+1145.474522662" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.913635 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.999429 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"24f9335f-6756-4590-9e0a-6bc7bd1f4b3e\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.999530 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24f9335f-6756-4590-9e0a-6bc7bd1f4b3e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"24f9335f-6756-4590-9e0a-6bc7bd1f4b3e\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:02:02 crc 
kubenswrapper[4774]: I1003 15:02:02.999582 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24f9335f-6756-4590-9e0a-6bc7bd1f4b3e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"24f9335f-6756-4590-9e0a-6bc7bd1f4b3e\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.999605 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24f9335f-6756-4590-9e0a-6bc7bd1f4b3e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"24f9335f-6756-4590-9e0a-6bc7bd1f4b3e\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.999668 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24f9335f-6756-4590-9e0a-6bc7bd1f4b3e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"24f9335f-6756-4590-9e0a-6bc7bd1f4b3e\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.999696 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/24f9335f-6756-4590-9e0a-6bc7bd1f4b3e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"24f9335f-6756-4590-9e0a-6bc7bd1f4b3e\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.999800 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg297\" (UniqueName: \"kubernetes.io/projected/24f9335f-6756-4590-9e0a-6bc7bd1f4b3e-kube-api-access-tg297\") pod \"glance-default-internal-api-0\" (UID: \"24f9335f-6756-4590-9e0a-6bc7bd1f4b3e\") " 
pod="openstack/glance-default-internal-api-0" Oct 03 15:02:02 crc kubenswrapper[4774]: I1003 15:02:02.999849 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24f9335f-6756-4590-9e0a-6bc7bd1f4b3e-logs\") pod \"glance-default-internal-api-0\" (UID: \"24f9335f-6756-4590-9e0a-6bc7bd1f4b3e\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:02:03 crc kubenswrapper[4774]: I1003 15:02:03.077857 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 03 15:02:03 crc kubenswrapper[4774]: I1003 15:02:03.102219 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24f9335f-6756-4590-9e0a-6bc7bd1f4b3e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"24f9335f-6756-4590-9e0a-6bc7bd1f4b3e\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:02:03 crc kubenswrapper[4774]: I1003 15:02:03.102267 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24f9335f-6756-4590-9e0a-6bc7bd1f4b3e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"24f9335f-6756-4590-9e0a-6bc7bd1f4b3e\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:02:03 crc kubenswrapper[4774]: I1003 15:02:03.102336 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24f9335f-6756-4590-9e0a-6bc7bd1f4b3e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"24f9335f-6756-4590-9e0a-6bc7bd1f4b3e\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:02:03 crc kubenswrapper[4774]: I1003 15:02:03.102364 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/24f9335f-6756-4590-9e0a-6bc7bd1f4b3e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"24f9335f-6756-4590-9e0a-6bc7bd1f4b3e\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:02:03 crc kubenswrapper[4774]: I1003 15:02:03.102471 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg297\" (UniqueName: \"kubernetes.io/projected/24f9335f-6756-4590-9e0a-6bc7bd1f4b3e-kube-api-access-tg297\") pod \"glance-default-internal-api-0\" (UID: \"24f9335f-6756-4590-9e0a-6bc7bd1f4b3e\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:02:03 crc kubenswrapper[4774]: I1003 15:02:03.102509 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24f9335f-6756-4590-9e0a-6bc7bd1f4b3e-logs\") pod \"glance-default-internal-api-0\" (UID: \"24f9335f-6756-4590-9e0a-6bc7bd1f4b3e\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:02:03 crc kubenswrapper[4774]: I1003 15:02:03.102552 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"24f9335f-6756-4590-9e0a-6bc7bd1f4b3e\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:02:03 crc kubenswrapper[4774]: I1003 15:02:03.102936 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/24f9335f-6756-4590-9e0a-6bc7bd1f4b3e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"24f9335f-6756-4590-9e0a-6bc7bd1f4b3e\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:02:03 crc kubenswrapper[4774]: I1003 15:02:03.103081 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24f9335f-6756-4590-9e0a-6bc7bd1f4b3e-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"24f9335f-6756-4590-9e0a-6bc7bd1f4b3e\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:02:03 crc kubenswrapper[4774]: I1003 15:02:03.103242 4774 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"24f9335f-6756-4590-9e0a-6bc7bd1f4b3e\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Oct 03 15:02:03 crc kubenswrapper[4774]: I1003 15:02:03.103409 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24f9335f-6756-4590-9e0a-6bc7bd1f4b3e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"24f9335f-6756-4590-9e0a-6bc7bd1f4b3e\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:02:03 crc kubenswrapper[4774]: I1003 15:02:03.107784 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24f9335f-6756-4590-9e0a-6bc7bd1f4b3e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"24f9335f-6756-4590-9e0a-6bc7bd1f4b3e\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:02:03 crc kubenswrapper[4774]: I1003 15:02:03.121810 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24f9335f-6756-4590-9e0a-6bc7bd1f4b3e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"24f9335f-6756-4590-9e0a-6bc7bd1f4b3e\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:02:03 crc kubenswrapper[4774]: I1003 15:02:03.129334 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24f9335f-6756-4590-9e0a-6bc7bd1f4b3e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"24f9335f-6756-4590-9e0a-6bc7bd1f4b3e\") " 
pod="openstack/glance-default-internal-api-0" Oct 03 15:02:03 crc kubenswrapper[4774]: I1003 15:02:03.130000 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg297\" (UniqueName: \"kubernetes.io/projected/24f9335f-6756-4590-9e0a-6bc7bd1f4b3e-kube-api-access-tg297\") pod \"glance-default-internal-api-0\" (UID: \"24f9335f-6756-4590-9e0a-6bc7bd1f4b3e\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:02:03 crc kubenswrapper[4774]: I1003 15:02:03.134243 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24f9335f-6756-4590-9e0a-6bc7bd1f4b3e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"24f9335f-6756-4590-9e0a-6bc7bd1f4b3e\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:02:03 crc kubenswrapper[4774]: I1003 15:02:03.147819 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"24f9335f-6756-4590-9e0a-6bc7bd1f4b3e\") " pod="openstack/glance-default-internal-api-0" Oct 03 15:02:03 crc kubenswrapper[4774]: I1003 15:02:03.184424 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 15:02:03 crc kubenswrapper[4774]: I1003 15:02:03.323862 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d22e7a67-ce5f-4276-8c81-4ff98ad47524" path="/var/lib/kubelet/pods/d22e7a67-ce5f-4276-8c81-4ff98ad47524/volumes" Oct 03 15:02:03 crc kubenswrapper[4774]: I1003 15:02:03.758674 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 15:02:03 crc kubenswrapper[4774]: I1003 15:02:03.789268 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"24f9335f-6756-4590-9e0a-6bc7bd1f4b3e","Type":"ContainerStarted","Data":"95c4ccff7429608a873d62d382811aeef4cb39e7730e3781120ee88d0c7ea8f8"} Oct 03 15:02:03 crc kubenswrapper[4774]: I1003 15:02:03.792531 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c18749aa-da1a-43fc-9275-5fee0919be21","Type":"ContainerStarted","Data":"319e8d8031d37718f8a89f9a10ac3508af3ef3a0a102bb021cdd4e6172312a90"} Oct 03 15:02:04 crc kubenswrapper[4774]: I1003 15:02:04.395351 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-22ec-account-create-tbdq7"] Oct 03 15:02:04 crc kubenswrapper[4774]: I1003 15:02:04.397070 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-22ec-account-create-tbdq7" Oct 03 15:02:04 crc kubenswrapper[4774]: I1003 15:02:04.400344 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 03 15:02:04 crc kubenswrapper[4774]: I1003 15:02:04.404062 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-22ec-account-create-tbdq7"] Oct 03 15:02:04 crc kubenswrapper[4774]: I1003 15:02:04.538106 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2fzk\" (UniqueName: \"kubernetes.io/projected/bc84e6e6-c0c7-44f6-8834-0e9f5e52f98c-kube-api-access-x2fzk\") pod \"nova-api-22ec-account-create-tbdq7\" (UID: \"bc84e6e6-c0c7-44f6-8834-0e9f5e52f98c\") " pod="openstack/nova-api-22ec-account-create-tbdq7" Oct 03 15:02:04 crc kubenswrapper[4774]: I1003 15:02:04.598088 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-abbe-account-create-tmxqt"] Oct 03 15:02:04 crc kubenswrapper[4774]: I1003 15:02:04.605951 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-abbe-account-create-tmxqt" Oct 03 15:02:04 crc kubenswrapper[4774]: I1003 15:02:04.608431 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 03 15:02:04 crc kubenswrapper[4774]: I1003 15:02:04.616516 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-abbe-account-create-tmxqt"] Oct 03 15:02:04 crc kubenswrapper[4774]: I1003 15:02:04.640254 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2fzk\" (UniqueName: \"kubernetes.io/projected/bc84e6e6-c0c7-44f6-8834-0e9f5e52f98c-kube-api-access-x2fzk\") pod \"nova-api-22ec-account-create-tbdq7\" (UID: \"bc84e6e6-c0c7-44f6-8834-0e9f5e52f98c\") " pod="openstack/nova-api-22ec-account-create-tbdq7" Oct 03 15:02:04 crc kubenswrapper[4774]: I1003 15:02:04.677335 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2fzk\" (UniqueName: \"kubernetes.io/projected/bc84e6e6-c0c7-44f6-8834-0e9f5e52f98c-kube-api-access-x2fzk\") pod \"nova-api-22ec-account-create-tbdq7\" (UID: \"bc84e6e6-c0c7-44f6-8834-0e9f5e52f98c\") " pod="openstack/nova-api-22ec-account-create-tbdq7" Oct 03 15:02:04 crc kubenswrapper[4774]: I1003 15:02:04.741559 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns2kb\" (UniqueName: \"kubernetes.io/projected/55fbe210-dfe5-4488-9461-8b1f67d30f49-kube-api-access-ns2kb\") pod \"nova-cell0-abbe-account-create-tmxqt\" (UID: \"55fbe210-dfe5-4488-9461-8b1f67d30f49\") " pod="openstack/nova-cell0-abbe-account-create-tmxqt" Oct 03 15:02:04 crc kubenswrapper[4774]: I1003 15:02:04.771918 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-22ec-account-create-tbdq7" Oct 03 15:02:04 crc kubenswrapper[4774]: I1003 15:02:04.795650 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-d56e-account-create-h6kfv"] Oct 03 15:02:04 crc kubenswrapper[4774]: I1003 15:02:04.797177 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-d56e-account-create-h6kfv" Oct 03 15:02:04 crc kubenswrapper[4774]: I1003 15:02:04.801517 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 03 15:02:04 crc kubenswrapper[4774]: I1003 15:02:04.804326 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-d56e-account-create-h6kfv"] Oct 03 15:02:04 crc kubenswrapper[4774]: I1003 15:02:04.817483 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"24f9335f-6756-4590-9e0a-6bc7bd1f4b3e","Type":"ContainerStarted","Data":"cb25b46e39d8e9837f175ce46553853adb7839de924cc293c7d42cc7d19dd049"} Oct 03 15:02:04 crc kubenswrapper[4774]: I1003 15:02:04.849920 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns2kb\" (UniqueName: \"kubernetes.io/projected/55fbe210-dfe5-4488-9461-8b1f67d30f49-kube-api-access-ns2kb\") pod \"nova-cell0-abbe-account-create-tmxqt\" (UID: \"55fbe210-dfe5-4488-9461-8b1f67d30f49\") " pod="openstack/nova-cell0-abbe-account-create-tmxqt" Oct 03 15:02:04 crc kubenswrapper[4774]: I1003 15:02:04.870000 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns2kb\" (UniqueName: \"kubernetes.io/projected/55fbe210-dfe5-4488-9461-8b1f67d30f49-kube-api-access-ns2kb\") pod \"nova-cell0-abbe-account-create-tmxqt\" (UID: \"55fbe210-dfe5-4488-9461-8b1f67d30f49\") " pod="openstack/nova-cell0-abbe-account-create-tmxqt" Oct 03 15:02:04 crc kubenswrapper[4774]: I1003 15:02:04.932844 4774 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-abbe-account-create-tmxqt" Oct 03 15:02:04 crc kubenswrapper[4774]: I1003 15:02:04.952328 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrg2j\" (UniqueName: \"kubernetes.io/projected/e1965846-33db-41a7-b097-d26e5a398986-kube-api-access-hrg2j\") pod \"nova-cell1-d56e-account-create-h6kfv\" (UID: \"e1965846-33db-41a7-b097-d26e5a398986\") " pod="openstack/nova-cell1-d56e-account-create-h6kfv" Oct 03 15:02:05 crc kubenswrapper[4774]: I1003 15:02:05.054576 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrg2j\" (UniqueName: \"kubernetes.io/projected/e1965846-33db-41a7-b097-d26e5a398986-kube-api-access-hrg2j\") pod \"nova-cell1-d56e-account-create-h6kfv\" (UID: \"e1965846-33db-41a7-b097-d26e5a398986\") " pod="openstack/nova-cell1-d56e-account-create-h6kfv" Oct 03 15:02:05 crc kubenswrapper[4774]: I1003 15:02:05.083234 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrg2j\" (UniqueName: \"kubernetes.io/projected/e1965846-33db-41a7-b097-d26e5a398986-kube-api-access-hrg2j\") pod \"nova-cell1-d56e-account-create-h6kfv\" (UID: \"e1965846-33db-41a7-b097-d26e5a398986\") " pod="openstack/nova-cell1-d56e-account-create-h6kfv" Oct 03 15:02:05 crc kubenswrapper[4774]: I1003 15:02:05.168814 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-d56e-account-create-h6kfv" Oct 03 15:02:05 crc kubenswrapper[4774]: I1003 15:02:05.449033 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-abbe-account-create-tmxqt"] Oct 03 15:02:05 crc kubenswrapper[4774]: I1003 15:02:05.471499 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-22ec-account-create-tbdq7"] Oct 03 15:02:05 crc kubenswrapper[4774]: I1003 15:02:05.707418 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-d56e-account-create-h6kfv"] Oct 03 15:02:05 crc kubenswrapper[4774]: I1003 15:02:05.828079 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c18749aa-da1a-43fc-9275-5fee0919be21","Type":"ContainerStarted","Data":"2832168179a6a10a064322b8bb6ae24c65e74b73d03c69a2a414decd98080236"} Oct 03 15:02:05 crc kubenswrapper[4774]: I1003 15:02:05.828190 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c18749aa-da1a-43fc-9275-5fee0919be21" containerName="proxy-httpd" containerID="cri-o://2832168179a6a10a064322b8bb6ae24c65e74b73d03c69a2a414decd98080236" gracePeriod=30 Oct 03 15:02:05 crc kubenswrapper[4774]: I1003 15:02:05.828214 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c18749aa-da1a-43fc-9275-5fee0919be21" containerName="sg-core" containerID="cri-o://319e8d8031d37718f8a89f9a10ac3508af3ef3a0a102bb021cdd4e6172312a90" gracePeriod=30 Oct 03 15:02:05 crc kubenswrapper[4774]: I1003 15:02:05.828209 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c18749aa-da1a-43fc-9275-5fee0919be21" containerName="ceilometer-notification-agent" containerID="cri-o://d01b66f9a28c0a083d17709cb4b1d3cb3c00444ea223bbef77817fcd4f26c71a" gracePeriod=30 Oct 03 15:02:05 crc kubenswrapper[4774]: I1003 15:02:05.828329 
4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c18749aa-da1a-43fc-9275-5fee0919be21" containerName="ceilometer-central-agent" containerID="cri-o://8be38e9e06abbf97ebb3732df0e30b8546e9e8a88ff6297fe72171cba2fda635" gracePeriod=30 Oct 03 15:02:05 crc kubenswrapper[4774]: I1003 15:02:05.828474 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 15:02:05 crc kubenswrapper[4774]: I1003 15:02:05.842454 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-abbe-account-create-tmxqt" event={"ID":"55fbe210-dfe5-4488-9461-8b1f67d30f49","Type":"ContainerStarted","Data":"114b2abde0eaa8c2e671f489ff2547660d05d7223bfaa5bd2166012758f295d5"} Oct 03 15:02:05 crc kubenswrapper[4774]: I1003 15:02:05.842499 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-abbe-account-create-tmxqt" event={"ID":"55fbe210-dfe5-4488-9461-8b1f67d30f49","Type":"ContainerStarted","Data":"73430b97f23d204bd91ba46c47315141f16b2e4b57a004205d3ca6dcb42ff215"} Oct 03 15:02:05 crc kubenswrapper[4774]: I1003 15:02:05.845986 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-22ec-account-create-tbdq7" event={"ID":"bc84e6e6-c0c7-44f6-8834-0e9f5e52f98c","Type":"ContainerStarted","Data":"b521d19d25187d27ea34d826f7b27192b58a98825368c09ffc418e2a3b582745"} Oct 03 15:02:05 crc kubenswrapper[4774]: I1003 15:02:05.846038 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-22ec-account-create-tbdq7" event={"ID":"bc84e6e6-c0c7-44f6-8834-0e9f5e52f98c","Type":"ContainerStarted","Data":"a5d521b23187380b5f19d07303f0690b7b67dd62980c474dfc5cbf3c2f76086e"} Oct 03 15:02:05 crc kubenswrapper[4774]: I1003 15:02:05.847404 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d56e-account-create-h6kfv" 
event={"ID":"e1965846-33db-41a7-b097-d26e5a398986","Type":"ContainerStarted","Data":"762bc8c33a2f4dedbf96cedd146750be8fc3fdb9607aaf724d15763b1ac04acc"} Oct 03 15:02:05 crc kubenswrapper[4774]: I1003 15:02:05.851532 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"24f9335f-6756-4590-9e0a-6bc7bd1f4b3e","Type":"ContainerStarted","Data":"b6e4bfb643a7c611276f96241c2d64047b2cd187f3ee20adc8532f42e1ddbdd3"} Oct 03 15:02:05 crc kubenswrapper[4774]: I1003 15:02:05.859792 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.00944818 podStartE2EDuration="6.859774583s" podCreationTimestamp="2025-10-03 15:01:59 +0000 UTC" firstStartedPulling="2025-10-03 15:02:00.945602722 +0000 UTC m=+1143.534806174" lastFinishedPulling="2025-10-03 15:02:04.795929135 +0000 UTC m=+1147.385132577" observedRunningTime="2025-10-03 15:02:05.854310168 +0000 UTC m=+1148.443513610" watchObservedRunningTime="2025-10-03 15:02:05.859774583 +0000 UTC m=+1148.448978035" Oct 03 15:02:05 crc kubenswrapper[4774]: I1003 15:02:05.880186 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.880167219 podStartE2EDuration="3.880167219s" podCreationTimestamp="2025-10-03 15:02:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:02:05.876689362 +0000 UTC m=+1148.465892804" watchObservedRunningTime="2025-10-03 15:02:05.880167219 +0000 UTC m=+1148.469370671" Oct 03 15:02:05 crc kubenswrapper[4774]: I1003 15:02:05.897416 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-22ec-account-create-tbdq7" podStartSLOduration=1.897399306 podStartE2EDuration="1.897399306s" podCreationTimestamp="2025-10-03 15:02:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:02:05.893561831 +0000 UTC m=+1148.482765283" watchObservedRunningTime="2025-10-03 15:02:05.897399306 +0000 UTC m=+1148.486602758" Oct 03 15:02:05 crc kubenswrapper[4774]: I1003 15:02:05.910442 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-abbe-account-create-tmxqt" podStartSLOduration=1.9104266189999999 podStartE2EDuration="1.910426619s" podCreationTimestamp="2025-10-03 15:02:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:02:05.908245665 +0000 UTC m=+1148.497449117" watchObservedRunningTime="2025-10-03 15:02:05.910426619 +0000 UTC m=+1148.499630071" Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.717250 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.782752 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c18749aa-da1a-43fc-9275-5fee0919be21-config-data\") pod \"c18749aa-da1a-43fc-9275-5fee0919be21\" (UID: \"c18749aa-da1a-43fc-9275-5fee0919be21\") " Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.782798 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c18749aa-da1a-43fc-9275-5fee0919be21-combined-ca-bundle\") pod \"c18749aa-da1a-43fc-9275-5fee0919be21\" (UID: \"c18749aa-da1a-43fc-9275-5fee0919be21\") " Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.782874 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c18749aa-da1a-43fc-9275-5fee0919be21-scripts\") pod \"c18749aa-da1a-43fc-9275-5fee0919be21\" (UID: 
\"c18749aa-da1a-43fc-9275-5fee0919be21\") " Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.782944 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47nq8\" (UniqueName: \"kubernetes.io/projected/c18749aa-da1a-43fc-9275-5fee0919be21-kube-api-access-47nq8\") pod \"c18749aa-da1a-43fc-9275-5fee0919be21\" (UID: \"c18749aa-da1a-43fc-9275-5fee0919be21\") " Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.784340 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c18749aa-da1a-43fc-9275-5fee0919be21-run-httpd\") pod \"c18749aa-da1a-43fc-9275-5fee0919be21\" (UID: \"c18749aa-da1a-43fc-9275-5fee0919be21\") " Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.784402 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c18749aa-da1a-43fc-9275-5fee0919be21-sg-core-conf-yaml\") pod \"c18749aa-da1a-43fc-9275-5fee0919be21\" (UID: \"c18749aa-da1a-43fc-9275-5fee0919be21\") " Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.784432 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c18749aa-da1a-43fc-9275-5fee0919be21-log-httpd\") pod \"c18749aa-da1a-43fc-9275-5fee0919be21\" (UID: \"c18749aa-da1a-43fc-9275-5fee0919be21\") " Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.784755 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c18749aa-da1a-43fc-9275-5fee0919be21-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c18749aa-da1a-43fc-9275-5fee0919be21" (UID: "c18749aa-da1a-43fc-9275-5fee0919be21"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.785155 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c18749aa-da1a-43fc-9275-5fee0919be21-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c18749aa-da1a-43fc-9275-5fee0919be21" (UID: "c18749aa-da1a-43fc-9275-5fee0919be21"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.785697 4774 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c18749aa-da1a-43fc-9275-5fee0919be21-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.785738 4774 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c18749aa-da1a-43fc-9275-5fee0919be21-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.806774 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c18749aa-da1a-43fc-9275-5fee0919be21-kube-api-access-47nq8" (OuterVolumeSpecName: "kube-api-access-47nq8") pod "c18749aa-da1a-43fc-9275-5fee0919be21" (UID: "c18749aa-da1a-43fc-9275-5fee0919be21"). InnerVolumeSpecName "kube-api-access-47nq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.806844 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c18749aa-da1a-43fc-9275-5fee0919be21-scripts" (OuterVolumeSpecName: "scripts") pod "c18749aa-da1a-43fc-9275-5fee0919be21" (UID: "c18749aa-da1a-43fc-9275-5fee0919be21"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.824340 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c18749aa-da1a-43fc-9275-5fee0919be21-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c18749aa-da1a-43fc-9275-5fee0919be21" (UID: "c18749aa-da1a-43fc-9275-5fee0919be21"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.864758 4774 generic.go:334] "Generic (PLEG): container finished" podID="55fbe210-dfe5-4488-9461-8b1f67d30f49" containerID="114b2abde0eaa8c2e671f489ff2547660d05d7223bfaa5bd2166012758f295d5" exitCode=0 Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.864844 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-abbe-account-create-tmxqt" event={"ID":"55fbe210-dfe5-4488-9461-8b1f67d30f49","Type":"ContainerDied","Data":"114b2abde0eaa8c2e671f489ff2547660d05d7223bfaa5bd2166012758f295d5"} Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.866559 4774 generic.go:334] "Generic (PLEG): container finished" podID="bc84e6e6-c0c7-44f6-8834-0e9f5e52f98c" containerID="b521d19d25187d27ea34d826f7b27192b58a98825368c09ffc418e2a3b582745" exitCode=0 Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.866699 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-22ec-account-create-tbdq7" event={"ID":"bc84e6e6-c0c7-44f6-8834-0e9f5e52f98c","Type":"ContainerDied","Data":"b521d19d25187d27ea34d826f7b27192b58a98825368c09ffc418e2a3b582745"} Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.868811 4774 generic.go:334] "Generic (PLEG): container finished" podID="e1965846-33db-41a7-b097-d26e5a398986" containerID="a1756df0b2652de10e71806e9f194ae59a9fb4519adeccb603d2ae3212145a32" exitCode=0 Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.868855 4774 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell1-d56e-account-create-h6kfv" event={"ID":"e1965846-33db-41a7-b097-d26e5a398986","Type":"ContainerDied","Data":"a1756df0b2652de10e71806e9f194ae59a9fb4519adeccb603d2ae3212145a32"} Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.871740 4774 generic.go:334] "Generic (PLEG): container finished" podID="c18749aa-da1a-43fc-9275-5fee0919be21" containerID="2832168179a6a10a064322b8bb6ae24c65e74b73d03c69a2a414decd98080236" exitCode=0 Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.871772 4774 generic.go:334] "Generic (PLEG): container finished" podID="c18749aa-da1a-43fc-9275-5fee0919be21" containerID="319e8d8031d37718f8a89f9a10ac3508af3ef3a0a102bb021cdd4e6172312a90" exitCode=2 Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.871782 4774 generic.go:334] "Generic (PLEG): container finished" podID="c18749aa-da1a-43fc-9275-5fee0919be21" containerID="d01b66f9a28c0a083d17709cb4b1d3cb3c00444ea223bbef77817fcd4f26c71a" exitCode=0 Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.871795 4774 generic.go:334] "Generic (PLEG): container finished" podID="c18749aa-da1a-43fc-9275-5fee0919be21" containerID="8be38e9e06abbf97ebb3732df0e30b8546e9e8a88ff6297fe72171cba2fda635" exitCode=0 Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.872840 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.873069 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c18749aa-da1a-43fc-9275-5fee0919be21","Type":"ContainerDied","Data":"2832168179a6a10a064322b8bb6ae24c65e74b73d03c69a2a414decd98080236"} Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.873103 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c18749aa-da1a-43fc-9275-5fee0919be21","Type":"ContainerDied","Data":"319e8d8031d37718f8a89f9a10ac3508af3ef3a0a102bb021cdd4e6172312a90"} Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.873118 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c18749aa-da1a-43fc-9275-5fee0919be21","Type":"ContainerDied","Data":"d01b66f9a28c0a083d17709cb4b1d3cb3c00444ea223bbef77817fcd4f26c71a"} Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.873130 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c18749aa-da1a-43fc-9275-5fee0919be21","Type":"ContainerDied","Data":"8be38e9e06abbf97ebb3732df0e30b8546e9e8a88ff6297fe72171cba2fda635"} Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.873140 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c18749aa-da1a-43fc-9275-5fee0919be21","Type":"ContainerDied","Data":"b59964bf9a5519a266f5267b6662dd42461eb7ddd85043e6846b2ba045d54caf"} Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.873160 4774 scope.go:117] "RemoveContainer" containerID="2832168179a6a10a064322b8bb6ae24c65e74b73d03c69a2a414decd98080236" Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.888116 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c18749aa-da1a-43fc-9275-5fee0919be21-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:06 crc 
kubenswrapper[4774]: I1003 15:02:06.888144 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47nq8\" (UniqueName: \"kubernetes.io/projected/c18749aa-da1a-43fc-9275-5fee0919be21-kube-api-access-47nq8\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.888154 4774 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c18749aa-da1a-43fc-9275-5fee0919be21-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.898407 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c18749aa-da1a-43fc-9275-5fee0919be21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c18749aa-da1a-43fc-9275-5fee0919be21" (UID: "c18749aa-da1a-43fc-9275-5fee0919be21"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.921241 4774 scope.go:117] "RemoveContainer" containerID="319e8d8031d37718f8a89f9a10ac3508af3ef3a0a102bb021cdd4e6172312a90" Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.941586 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c18749aa-da1a-43fc-9275-5fee0919be21-config-data" (OuterVolumeSpecName: "config-data") pod "c18749aa-da1a-43fc-9275-5fee0919be21" (UID: "c18749aa-da1a-43fc-9275-5fee0919be21"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.952241 4774 scope.go:117] "RemoveContainer" containerID="d01b66f9a28c0a083d17709cb4b1d3cb3c00444ea223bbef77817fcd4f26c71a" Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.982160 4774 scope.go:117] "RemoveContainer" containerID="8be38e9e06abbf97ebb3732df0e30b8546e9e8a88ff6297fe72171cba2fda635" Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.990587 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c18749aa-da1a-43fc-9275-5fee0919be21-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:06 crc kubenswrapper[4774]: I1003 15:02:06.990622 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c18749aa-da1a-43fc-9275-5fee0919be21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.010185 4774 scope.go:117] "RemoveContainer" containerID="2832168179a6a10a064322b8bb6ae24c65e74b73d03c69a2a414decd98080236" Oct 03 15:02:07 crc kubenswrapper[4774]: E1003 15:02:07.011066 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2832168179a6a10a064322b8bb6ae24c65e74b73d03c69a2a414decd98080236\": container with ID starting with 2832168179a6a10a064322b8bb6ae24c65e74b73d03c69a2a414decd98080236 not found: ID does not exist" containerID="2832168179a6a10a064322b8bb6ae24c65e74b73d03c69a2a414decd98080236" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.011308 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2832168179a6a10a064322b8bb6ae24c65e74b73d03c69a2a414decd98080236"} err="failed to get container status \"2832168179a6a10a064322b8bb6ae24c65e74b73d03c69a2a414decd98080236\": rpc error: code = NotFound desc = could not find container 
\"2832168179a6a10a064322b8bb6ae24c65e74b73d03c69a2a414decd98080236\": container with ID starting with 2832168179a6a10a064322b8bb6ae24c65e74b73d03c69a2a414decd98080236 not found: ID does not exist" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.011342 4774 scope.go:117] "RemoveContainer" containerID="319e8d8031d37718f8a89f9a10ac3508af3ef3a0a102bb021cdd4e6172312a90" Oct 03 15:02:07 crc kubenswrapper[4774]: E1003 15:02:07.011864 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"319e8d8031d37718f8a89f9a10ac3508af3ef3a0a102bb021cdd4e6172312a90\": container with ID starting with 319e8d8031d37718f8a89f9a10ac3508af3ef3a0a102bb021cdd4e6172312a90 not found: ID does not exist" containerID="319e8d8031d37718f8a89f9a10ac3508af3ef3a0a102bb021cdd4e6172312a90" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.011914 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"319e8d8031d37718f8a89f9a10ac3508af3ef3a0a102bb021cdd4e6172312a90"} err="failed to get container status \"319e8d8031d37718f8a89f9a10ac3508af3ef3a0a102bb021cdd4e6172312a90\": rpc error: code = NotFound desc = could not find container \"319e8d8031d37718f8a89f9a10ac3508af3ef3a0a102bb021cdd4e6172312a90\": container with ID starting with 319e8d8031d37718f8a89f9a10ac3508af3ef3a0a102bb021cdd4e6172312a90 not found: ID does not exist" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.011937 4774 scope.go:117] "RemoveContainer" containerID="d01b66f9a28c0a083d17709cb4b1d3cb3c00444ea223bbef77817fcd4f26c71a" Oct 03 15:02:07 crc kubenswrapper[4774]: E1003 15:02:07.012241 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d01b66f9a28c0a083d17709cb4b1d3cb3c00444ea223bbef77817fcd4f26c71a\": container with ID starting with d01b66f9a28c0a083d17709cb4b1d3cb3c00444ea223bbef77817fcd4f26c71a not found: ID does not exist" 
containerID="d01b66f9a28c0a083d17709cb4b1d3cb3c00444ea223bbef77817fcd4f26c71a" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.012286 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d01b66f9a28c0a083d17709cb4b1d3cb3c00444ea223bbef77817fcd4f26c71a"} err="failed to get container status \"d01b66f9a28c0a083d17709cb4b1d3cb3c00444ea223bbef77817fcd4f26c71a\": rpc error: code = NotFound desc = could not find container \"d01b66f9a28c0a083d17709cb4b1d3cb3c00444ea223bbef77817fcd4f26c71a\": container with ID starting with d01b66f9a28c0a083d17709cb4b1d3cb3c00444ea223bbef77817fcd4f26c71a not found: ID does not exist" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.012317 4774 scope.go:117] "RemoveContainer" containerID="8be38e9e06abbf97ebb3732df0e30b8546e9e8a88ff6297fe72171cba2fda635" Oct 03 15:02:07 crc kubenswrapper[4774]: E1003 15:02:07.012690 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8be38e9e06abbf97ebb3732df0e30b8546e9e8a88ff6297fe72171cba2fda635\": container with ID starting with 8be38e9e06abbf97ebb3732df0e30b8546e9e8a88ff6297fe72171cba2fda635 not found: ID does not exist" containerID="8be38e9e06abbf97ebb3732df0e30b8546e9e8a88ff6297fe72171cba2fda635" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.012733 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8be38e9e06abbf97ebb3732df0e30b8546e9e8a88ff6297fe72171cba2fda635"} err="failed to get container status \"8be38e9e06abbf97ebb3732df0e30b8546e9e8a88ff6297fe72171cba2fda635\": rpc error: code = NotFound desc = could not find container \"8be38e9e06abbf97ebb3732df0e30b8546e9e8a88ff6297fe72171cba2fda635\": container with ID starting with 8be38e9e06abbf97ebb3732df0e30b8546e9e8a88ff6297fe72171cba2fda635 not found: ID does not exist" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.012754 4774 scope.go:117] 
"RemoveContainer" containerID="2832168179a6a10a064322b8bb6ae24c65e74b73d03c69a2a414decd98080236" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.012976 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2832168179a6a10a064322b8bb6ae24c65e74b73d03c69a2a414decd98080236"} err="failed to get container status \"2832168179a6a10a064322b8bb6ae24c65e74b73d03c69a2a414decd98080236\": rpc error: code = NotFound desc = could not find container \"2832168179a6a10a064322b8bb6ae24c65e74b73d03c69a2a414decd98080236\": container with ID starting with 2832168179a6a10a064322b8bb6ae24c65e74b73d03c69a2a414decd98080236 not found: ID does not exist" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.012999 4774 scope.go:117] "RemoveContainer" containerID="319e8d8031d37718f8a89f9a10ac3508af3ef3a0a102bb021cdd4e6172312a90" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.013206 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"319e8d8031d37718f8a89f9a10ac3508af3ef3a0a102bb021cdd4e6172312a90"} err="failed to get container status \"319e8d8031d37718f8a89f9a10ac3508af3ef3a0a102bb021cdd4e6172312a90\": rpc error: code = NotFound desc = could not find container \"319e8d8031d37718f8a89f9a10ac3508af3ef3a0a102bb021cdd4e6172312a90\": container with ID starting with 319e8d8031d37718f8a89f9a10ac3508af3ef3a0a102bb021cdd4e6172312a90 not found: ID does not exist" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.013232 4774 scope.go:117] "RemoveContainer" containerID="d01b66f9a28c0a083d17709cb4b1d3cb3c00444ea223bbef77817fcd4f26c71a" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.013446 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d01b66f9a28c0a083d17709cb4b1d3cb3c00444ea223bbef77817fcd4f26c71a"} err="failed to get container status \"d01b66f9a28c0a083d17709cb4b1d3cb3c00444ea223bbef77817fcd4f26c71a\": rpc error: code = 
NotFound desc = could not find container \"d01b66f9a28c0a083d17709cb4b1d3cb3c00444ea223bbef77817fcd4f26c71a\": container with ID starting with d01b66f9a28c0a083d17709cb4b1d3cb3c00444ea223bbef77817fcd4f26c71a not found: ID does not exist" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.013473 4774 scope.go:117] "RemoveContainer" containerID="8be38e9e06abbf97ebb3732df0e30b8546e9e8a88ff6297fe72171cba2fda635" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.013720 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8be38e9e06abbf97ebb3732df0e30b8546e9e8a88ff6297fe72171cba2fda635"} err="failed to get container status \"8be38e9e06abbf97ebb3732df0e30b8546e9e8a88ff6297fe72171cba2fda635\": rpc error: code = NotFound desc = could not find container \"8be38e9e06abbf97ebb3732df0e30b8546e9e8a88ff6297fe72171cba2fda635\": container with ID starting with 8be38e9e06abbf97ebb3732df0e30b8546e9e8a88ff6297fe72171cba2fda635 not found: ID does not exist" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.013746 4774 scope.go:117] "RemoveContainer" containerID="2832168179a6a10a064322b8bb6ae24c65e74b73d03c69a2a414decd98080236" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.013995 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2832168179a6a10a064322b8bb6ae24c65e74b73d03c69a2a414decd98080236"} err="failed to get container status \"2832168179a6a10a064322b8bb6ae24c65e74b73d03c69a2a414decd98080236\": rpc error: code = NotFound desc = could not find container \"2832168179a6a10a064322b8bb6ae24c65e74b73d03c69a2a414decd98080236\": container with ID starting with 2832168179a6a10a064322b8bb6ae24c65e74b73d03c69a2a414decd98080236 not found: ID does not exist" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.014015 4774 scope.go:117] "RemoveContainer" containerID="319e8d8031d37718f8a89f9a10ac3508af3ef3a0a102bb021cdd4e6172312a90" Oct 03 15:02:07 crc 
kubenswrapper[4774]: I1003 15:02:07.014513 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"319e8d8031d37718f8a89f9a10ac3508af3ef3a0a102bb021cdd4e6172312a90"} err="failed to get container status \"319e8d8031d37718f8a89f9a10ac3508af3ef3a0a102bb021cdd4e6172312a90\": rpc error: code = NotFound desc = could not find container \"319e8d8031d37718f8a89f9a10ac3508af3ef3a0a102bb021cdd4e6172312a90\": container with ID starting with 319e8d8031d37718f8a89f9a10ac3508af3ef3a0a102bb021cdd4e6172312a90 not found: ID does not exist" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.014536 4774 scope.go:117] "RemoveContainer" containerID="d01b66f9a28c0a083d17709cb4b1d3cb3c00444ea223bbef77817fcd4f26c71a" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.014930 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d01b66f9a28c0a083d17709cb4b1d3cb3c00444ea223bbef77817fcd4f26c71a"} err="failed to get container status \"d01b66f9a28c0a083d17709cb4b1d3cb3c00444ea223bbef77817fcd4f26c71a\": rpc error: code = NotFound desc = could not find container \"d01b66f9a28c0a083d17709cb4b1d3cb3c00444ea223bbef77817fcd4f26c71a\": container with ID starting with d01b66f9a28c0a083d17709cb4b1d3cb3c00444ea223bbef77817fcd4f26c71a not found: ID does not exist" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.014950 4774 scope.go:117] "RemoveContainer" containerID="8be38e9e06abbf97ebb3732df0e30b8546e9e8a88ff6297fe72171cba2fda635" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.015261 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8be38e9e06abbf97ebb3732df0e30b8546e9e8a88ff6297fe72171cba2fda635"} err="failed to get container status \"8be38e9e06abbf97ebb3732df0e30b8546e9e8a88ff6297fe72171cba2fda635\": rpc error: code = NotFound desc = could not find container \"8be38e9e06abbf97ebb3732df0e30b8546e9e8a88ff6297fe72171cba2fda635\": container 
with ID starting with 8be38e9e06abbf97ebb3732df0e30b8546e9e8a88ff6297fe72171cba2fda635 not found: ID does not exist" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.015282 4774 scope.go:117] "RemoveContainer" containerID="2832168179a6a10a064322b8bb6ae24c65e74b73d03c69a2a414decd98080236" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.015572 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2832168179a6a10a064322b8bb6ae24c65e74b73d03c69a2a414decd98080236"} err="failed to get container status \"2832168179a6a10a064322b8bb6ae24c65e74b73d03c69a2a414decd98080236\": rpc error: code = NotFound desc = could not find container \"2832168179a6a10a064322b8bb6ae24c65e74b73d03c69a2a414decd98080236\": container with ID starting with 2832168179a6a10a064322b8bb6ae24c65e74b73d03c69a2a414decd98080236 not found: ID does not exist" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.015604 4774 scope.go:117] "RemoveContainer" containerID="319e8d8031d37718f8a89f9a10ac3508af3ef3a0a102bb021cdd4e6172312a90" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.015954 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"319e8d8031d37718f8a89f9a10ac3508af3ef3a0a102bb021cdd4e6172312a90"} err="failed to get container status \"319e8d8031d37718f8a89f9a10ac3508af3ef3a0a102bb021cdd4e6172312a90\": rpc error: code = NotFound desc = could not find container \"319e8d8031d37718f8a89f9a10ac3508af3ef3a0a102bb021cdd4e6172312a90\": container with ID starting with 319e8d8031d37718f8a89f9a10ac3508af3ef3a0a102bb021cdd4e6172312a90 not found: ID does not exist" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.015981 4774 scope.go:117] "RemoveContainer" containerID="d01b66f9a28c0a083d17709cb4b1d3cb3c00444ea223bbef77817fcd4f26c71a" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.016211 4774 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d01b66f9a28c0a083d17709cb4b1d3cb3c00444ea223bbef77817fcd4f26c71a"} err="failed to get container status \"d01b66f9a28c0a083d17709cb4b1d3cb3c00444ea223bbef77817fcd4f26c71a\": rpc error: code = NotFound desc = could not find container \"d01b66f9a28c0a083d17709cb4b1d3cb3c00444ea223bbef77817fcd4f26c71a\": container with ID starting with d01b66f9a28c0a083d17709cb4b1d3cb3c00444ea223bbef77817fcd4f26c71a not found: ID does not exist" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.016233 4774 scope.go:117] "RemoveContainer" containerID="8be38e9e06abbf97ebb3732df0e30b8546e9e8a88ff6297fe72171cba2fda635" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.016501 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8be38e9e06abbf97ebb3732df0e30b8546e9e8a88ff6297fe72171cba2fda635"} err="failed to get container status \"8be38e9e06abbf97ebb3732df0e30b8546e9e8a88ff6297fe72171cba2fda635\": rpc error: code = NotFound desc = could not find container \"8be38e9e06abbf97ebb3732df0e30b8546e9e8a88ff6297fe72171cba2fda635\": container with ID starting with 8be38e9e06abbf97ebb3732df0e30b8546e9e8a88ff6297fe72171cba2fda635 not found: ID does not exist" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.211722 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.221084 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.241212 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:02:07 crc kubenswrapper[4774]: E1003 15:02:07.242898 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c18749aa-da1a-43fc-9275-5fee0919be21" containerName="sg-core" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.242927 4774 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c18749aa-da1a-43fc-9275-5fee0919be21" containerName="sg-core" Oct 03 15:02:07 crc kubenswrapper[4774]: E1003 15:02:07.242953 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c18749aa-da1a-43fc-9275-5fee0919be21" containerName="ceilometer-notification-agent" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.242961 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="c18749aa-da1a-43fc-9275-5fee0919be21" containerName="ceilometer-notification-agent" Oct 03 15:02:07 crc kubenswrapper[4774]: E1003 15:02:07.242979 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c18749aa-da1a-43fc-9275-5fee0919be21" containerName="proxy-httpd" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.242986 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="c18749aa-da1a-43fc-9275-5fee0919be21" containerName="proxy-httpd" Oct 03 15:02:07 crc kubenswrapper[4774]: E1003 15:02:07.242998 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c18749aa-da1a-43fc-9275-5fee0919be21" containerName="ceilometer-central-agent" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.243004 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="c18749aa-da1a-43fc-9275-5fee0919be21" containerName="ceilometer-central-agent" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.243209 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="c18749aa-da1a-43fc-9275-5fee0919be21" containerName="ceilometer-central-agent" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.243235 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="c18749aa-da1a-43fc-9275-5fee0919be21" containerName="sg-core" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.243244 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="c18749aa-da1a-43fc-9275-5fee0919be21" containerName="proxy-httpd" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.243253 4774 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="c18749aa-da1a-43fc-9275-5fee0919be21" containerName="ceilometer-notification-agent" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.245236 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.249462 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.249811 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.268752 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.295034 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82dca456-3f74-4976-a252-27f68a189944-log-httpd\") pod \"ceilometer-0\" (UID: \"82dca456-3f74-4976-a252-27f68a189944\") " pod="openstack/ceilometer-0" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.295085 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82dca456-3f74-4976-a252-27f68a189944-run-httpd\") pod \"ceilometer-0\" (UID: \"82dca456-3f74-4976-a252-27f68a189944\") " pod="openstack/ceilometer-0" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.295108 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gsz5\" (UniqueName: \"kubernetes.io/projected/82dca456-3f74-4976-a252-27f68a189944-kube-api-access-8gsz5\") pod \"ceilometer-0\" (UID: \"82dca456-3f74-4976-a252-27f68a189944\") " pod="openstack/ceilometer-0" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.295140 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82dca456-3f74-4976-a252-27f68a189944-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"82dca456-3f74-4976-a252-27f68a189944\") " pod="openstack/ceilometer-0" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.295202 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82dca456-3f74-4976-a252-27f68a189944-scripts\") pod \"ceilometer-0\" (UID: \"82dca456-3f74-4976-a252-27f68a189944\") " pod="openstack/ceilometer-0" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.295233 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82dca456-3f74-4976-a252-27f68a189944-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"82dca456-3f74-4976-a252-27f68a189944\") " pod="openstack/ceilometer-0" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.295250 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82dca456-3f74-4976-a252-27f68a189944-config-data\") pod \"ceilometer-0\" (UID: \"82dca456-3f74-4976-a252-27f68a189944\") " pod="openstack/ceilometer-0" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.337942 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c18749aa-da1a-43fc-9275-5fee0919be21" path="/var/lib/kubelet/pods/c18749aa-da1a-43fc-9275-5fee0919be21/volumes" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.397366 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82dca456-3f74-4976-a252-27f68a189944-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"82dca456-3f74-4976-a252-27f68a189944\") " 
pod="openstack/ceilometer-0" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.397585 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82dca456-3f74-4976-a252-27f68a189944-scripts\") pod \"ceilometer-0\" (UID: \"82dca456-3f74-4976-a252-27f68a189944\") " pod="openstack/ceilometer-0" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.397642 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82dca456-3f74-4976-a252-27f68a189944-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"82dca456-3f74-4976-a252-27f68a189944\") " pod="openstack/ceilometer-0" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.397666 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82dca456-3f74-4976-a252-27f68a189944-config-data\") pod \"ceilometer-0\" (UID: \"82dca456-3f74-4976-a252-27f68a189944\") " pod="openstack/ceilometer-0" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.397767 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82dca456-3f74-4976-a252-27f68a189944-log-httpd\") pod \"ceilometer-0\" (UID: \"82dca456-3f74-4976-a252-27f68a189944\") " pod="openstack/ceilometer-0" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.397814 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82dca456-3f74-4976-a252-27f68a189944-run-httpd\") pod \"ceilometer-0\" (UID: \"82dca456-3f74-4976-a252-27f68a189944\") " pod="openstack/ceilometer-0" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.397843 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gsz5\" (UniqueName: 
\"kubernetes.io/projected/82dca456-3f74-4976-a252-27f68a189944-kube-api-access-8gsz5\") pod \"ceilometer-0\" (UID: \"82dca456-3f74-4976-a252-27f68a189944\") " pod="openstack/ceilometer-0" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.399229 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82dca456-3f74-4976-a252-27f68a189944-log-httpd\") pod \"ceilometer-0\" (UID: \"82dca456-3f74-4976-a252-27f68a189944\") " pod="openstack/ceilometer-0" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.399730 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82dca456-3f74-4976-a252-27f68a189944-run-httpd\") pod \"ceilometer-0\" (UID: \"82dca456-3f74-4976-a252-27f68a189944\") " pod="openstack/ceilometer-0" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.401507 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82dca456-3f74-4976-a252-27f68a189944-config-data\") pod \"ceilometer-0\" (UID: \"82dca456-3f74-4976-a252-27f68a189944\") " pod="openstack/ceilometer-0" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.401899 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82dca456-3f74-4976-a252-27f68a189944-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"82dca456-3f74-4976-a252-27f68a189944\") " pod="openstack/ceilometer-0" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.403308 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82dca456-3f74-4976-a252-27f68a189944-scripts\") pod \"ceilometer-0\" (UID: \"82dca456-3f74-4976-a252-27f68a189944\") " pod="openstack/ceilometer-0" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.413288 4774 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82dca456-3f74-4976-a252-27f68a189944-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"82dca456-3f74-4976-a252-27f68a189944\") " pod="openstack/ceilometer-0" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.415564 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gsz5\" (UniqueName: \"kubernetes.io/projected/82dca456-3f74-4976-a252-27f68a189944-kube-api-access-8gsz5\") pod \"ceilometer-0\" (UID: \"82dca456-3f74-4976-a252-27f68a189944\") " pod="openstack/ceilometer-0" Oct 03 15:02:07 crc kubenswrapper[4774]: I1003 15:02:07.561638 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 15:02:08 crc kubenswrapper[4774]: I1003 15:02:08.004544 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:02:08 crc kubenswrapper[4774]: W1003 15:02:08.028966 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82dca456_3f74_4976_a252_27f68a189944.slice/crio-c4667b62392f49fd286fc326239862db0ded73b7efc60281e791f6bf53b33f02 WatchSource:0}: Error finding container c4667b62392f49fd286fc326239862db0ded73b7efc60281e791f6bf53b33f02: Status 404 returned error can't find the container with id c4667b62392f49fd286fc326239862db0ded73b7efc60281e791f6bf53b33f02 Oct 03 15:02:08 crc kubenswrapper[4774]: I1003 15:02:08.162862 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-22ec-account-create-tbdq7" Oct 03 15:02:08 crc kubenswrapper[4774]: I1003 15:02:08.215127 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2fzk\" (UniqueName: \"kubernetes.io/projected/bc84e6e6-c0c7-44f6-8834-0e9f5e52f98c-kube-api-access-x2fzk\") pod \"bc84e6e6-c0c7-44f6-8834-0e9f5e52f98c\" (UID: \"bc84e6e6-c0c7-44f6-8834-0e9f5e52f98c\") " Oct 03 15:02:08 crc kubenswrapper[4774]: I1003 15:02:08.228240 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc84e6e6-c0c7-44f6-8834-0e9f5e52f98c-kube-api-access-x2fzk" (OuterVolumeSpecName: "kube-api-access-x2fzk") pod "bc84e6e6-c0c7-44f6-8834-0e9f5e52f98c" (UID: "bc84e6e6-c0c7-44f6-8834-0e9f5e52f98c"). InnerVolumeSpecName "kube-api-access-x2fzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:02:08 crc kubenswrapper[4774]: I1003 15:02:08.247966 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-d56e-account-create-h6kfv" Oct 03 15:02:08 crc kubenswrapper[4774]: I1003 15:02:08.317393 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrg2j\" (UniqueName: \"kubernetes.io/projected/e1965846-33db-41a7-b097-d26e5a398986-kube-api-access-hrg2j\") pod \"e1965846-33db-41a7-b097-d26e5a398986\" (UID: \"e1965846-33db-41a7-b097-d26e5a398986\") " Oct 03 15:02:08 crc kubenswrapper[4774]: I1003 15:02:08.317932 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2fzk\" (UniqueName: \"kubernetes.io/projected/bc84e6e6-c0c7-44f6-8834-0e9f5e52f98c-kube-api-access-x2fzk\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:08 crc kubenswrapper[4774]: I1003 15:02:08.349060 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1965846-33db-41a7-b097-d26e5a398986-kube-api-access-hrg2j" (OuterVolumeSpecName: "kube-api-access-hrg2j") pod "e1965846-33db-41a7-b097-d26e5a398986" (UID: "e1965846-33db-41a7-b097-d26e5a398986"). InnerVolumeSpecName "kube-api-access-hrg2j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:02:08 crc kubenswrapper[4774]: I1003 15:02:08.351014 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:02:08 crc kubenswrapper[4774]: I1003 15:02:08.406497 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 03 15:02:08 crc kubenswrapper[4774]: I1003 15:02:08.422363 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrg2j\" (UniqueName: \"kubernetes.io/projected/e1965846-33db-41a7-b097-d26e5a398986-kube-api-access-hrg2j\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:08 crc kubenswrapper[4774]: I1003 15:02:08.611306 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-77bb8d5544-lc44r" Oct 03 15:02:08 crc kubenswrapper[4774]: I1003 15:02:08.655660 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-abbe-account-create-tmxqt" Oct 03 15:02:08 crc kubenswrapper[4774]: I1003 15:02:08.728521 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ns2kb\" (UniqueName: \"kubernetes.io/projected/55fbe210-dfe5-4488-9461-8b1f67d30f49-kube-api-access-ns2kb\") pod \"55fbe210-dfe5-4488-9461-8b1f67d30f49\" (UID: \"55fbe210-dfe5-4488-9461-8b1f67d30f49\") " Oct 03 15:02:08 crc kubenswrapper[4774]: I1003 15:02:08.743511 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55fbe210-dfe5-4488-9461-8b1f67d30f49-kube-api-access-ns2kb" (OuterVolumeSpecName: "kube-api-access-ns2kb") pod "55fbe210-dfe5-4488-9461-8b1f67d30f49" (UID: "55fbe210-dfe5-4488-9461-8b1f67d30f49"). InnerVolumeSpecName "kube-api-access-ns2kb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:02:08 crc kubenswrapper[4774]: I1003 15:02:08.832557 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ns2kb\" (UniqueName: \"kubernetes.io/projected/55fbe210-dfe5-4488-9461-8b1f67d30f49-kube-api-access-ns2kb\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:08 crc kubenswrapper[4774]: I1003 15:02:08.904769 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82dca456-3f74-4976-a252-27f68a189944","Type":"ContainerStarted","Data":"d47b7a33d34e46154ecd5a37b959bca07fcc8346b172b26189a6c5515a503cef"} Oct 03 15:02:08 crc kubenswrapper[4774]: I1003 15:02:08.904839 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82dca456-3f74-4976-a252-27f68a189944","Type":"ContainerStarted","Data":"c4667b62392f49fd286fc326239862db0ded73b7efc60281e791f6bf53b33f02"} Oct 03 15:02:08 crc kubenswrapper[4774]: I1003 15:02:08.910004 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-abbe-account-create-tmxqt" event={"ID":"55fbe210-dfe5-4488-9461-8b1f67d30f49","Type":"ContainerDied","Data":"73430b97f23d204bd91ba46c47315141f16b2e4b57a004205d3ca6dcb42ff215"} Oct 03 15:02:08 crc kubenswrapper[4774]: I1003 15:02:08.910020 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-abbe-account-create-tmxqt" Oct 03 15:02:08 crc kubenswrapper[4774]: I1003 15:02:08.910037 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73430b97f23d204bd91ba46c47315141f16b2e4b57a004205d3ca6dcb42ff215" Oct 03 15:02:08 crc kubenswrapper[4774]: I1003 15:02:08.911805 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-22ec-account-create-tbdq7" Oct 03 15:02:08 crc kubenswrapper[4774]: I1003 15:02:08.911796 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-22ec-account-create-tbdq7" event={"ID":"bc84e6e6-c0c7-44f6-8834-0e9f5e52f98c","Type":"ContainerDied","Data":"a5d521b23187380b5f19d07303f0690b7b67dd62980c474dfc5cbf3c2f76086e"} Oct 03 15:02:08 crc kubenswrapper[4774]: I1003 15:02:08.911906 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5d521b23187380b5f19d07303f0690b7b67dd62980c474dfc5cbf3c2f76086e" Oct 03 15:02:08 crc kubenswrapper[4774]: I1003 15:02:08.921200 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d56e-account-create-h6kfv" event={"ID":"e1965846-33db-41a7-b097-d26e5a398986","Type":"ContainerDied","Data":"762bc8c33a2f4dedbf96cedd146750be8fc3fdb9607aaf724d15763b1ac04acc"} Oct 03 15:02:08 crc kubenswrapper[4774]: I1003 15:02:08.921241 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="762bc8c33a2f4dedbf96cedd146750be8fc3fdb9607aaf724d15763b1ac04acc" Oct 03 15:02:08 crc kubenswrapper[4774]: I1003 15:02:08.921264 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-d56e-account-create-h6kfv" Oct 03 15:02:09 crc kubenswrapper[4774]: I1003 15:02:09.930644 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82dca456-3f74-4976-a252-27f68a189944","Type":"ContainerStarted","Data":"60c99f4078afa883ce346bae835c73380aec43d7892ce5cc2c504d2aa6836694"} Oct 03 15:02:10 crc kubenswrapper[4774]: I1003 15:02:10.185046 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 15:02:10 crc kubenswrapper[4774]: I1003 15:02:10.185097 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 15:02:10 crc kubenswrapper[4774]: I1003 15:02:10.223604 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 15:02:10 crc kubenswrapper[4774]: I1003 15:02:10.233523 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 15:02:10 crc kubenswrapper[4774]: I1003 15:02:10.942162 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82dca456-3f74-4976-a252-27f68a189944","Type":"ContainerStarted","Data":"f8e41ac24f1f00f6478f75e26305ea4a0e9c3ba8f26901db674b2fddf85127bd"} Oct 03 15:02:10 crc kubenswrapper[4774]: I1003 15:02:10.942629 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 03 15:02:10 crc kubenswrapper[4774]: I1003 15:02:10.942664 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 03 15:02:11 crc kubenswrapper[4774]: I1003 15:02:11.227755 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-987467b4f-dts4l" Oct 03 15:02:11 crc kubenswrapper[4774]: I1003 
15:02:11.296833 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-77bb8d5544-lc44r"] Oct 03 15:02:11 crc kubenswrapper[4774]: I1003 15:02:11.300049 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-77bb8d5544-lc44r" podUID="6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c" containerName="neutron-api" containerID="cri-o://6abece3013abb91833b7834b0416ec0d019a4edbf03c419fb55d137beffaa70a" gracePeriod=30 Oct 03 15:02:11 crc kubenswrapper[4774]: I1003 15:02:11.300528 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-77bb8d5544-lc44r" podUID="6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c" containerName="neutron-httpd" containerID="cri-o://4fbe9fa375c2de6d09e0551a0d46765f1b6fc379dc2b0650a0d993bd9a9f8d9f" gracePeriod=30 Oct 03 15:02:11 crc kubenswrapper[4774]: I1003 15:02:11.952548 4774 generic.go:334] "Generic (PLEG): container finished" podID="6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c" containerID="4fbe9fa375c2de6d09e0551a0d46765f1b6fc379dc2b0650a0d993bd9a9f8d9f" exitCode=0 Oct 03 15:02:11 crc kubenswrapper[4774]: I1003 15:02:11.952790 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77bb8d5544-lc44r" event={"ID":"6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c","Type":"ContainerDied","Data":"4fbe9fa375c2de6d09e0551a0d46765f1b6fc379dc2b0650a0d993bd9a9f8d9f"} Oct 03 15:02:12 crc kubenswrapper[4774]: I1003 15:02:12.963486 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82dca456-3f74-4976-a252-27f68a189944","Type":"ContainerStarted","Data":"e89ffcd907f223753cf30bfd035bb731070f165179efe27f1489a10ee205ff48"} Oct 03 15:02:12 crc kubenswrapper[4774]: I1003 15:02:12.963847 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 15:02:12 crc kubenswrapper[4774]: I1003 15:02:12.963616 4774 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="82dca456-3f74-4976-a252-27f68a189944" containerName="ceilometer-central-agent" containerID="cri-o://d47b7a33d34e46154ecd5a37b959bca07fcc8346b172b26189a6c5515a503cef" gracePeriod=30 Oct 03 15:02:12 crc kubenswrapper[4774]: I1003 15:02:12.963652 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="82dca456-3f74-4976-a252-27f68a189944" containerName="proxy-httpd" containerID="cri-o://e89ffcd907f223753cf30bfd035bb731070f165179efe27f1489a10ee205ff48" gracePeriod=30 Oct 03 15:02:12 crc kubenswrapper[4774]: I1003 15:02:12.963674 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="82dca456-3f74-4976-a252-27f68a189944" containerName="ceilometer-notification-agent" containerID="cri-o://60c99f4078afa883ce346bae835c73380aec43d7892ce5cc2c504d2aa6836694" gracePeriod=30 Oct 03 15:02:12 crc kubenswrapper[4774]: I1003 15:02:12.963664 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="82dca456-3f74-4976-a252-27f68a189944" containerName="sg-core" containerID="cri-o://f8e41ac24f1f00f6478f75e26305ea4a0e9c3ba8f26901db674b2fddf85127bd" gracePeriod=30 Oct 03 15:02:12 crc kubenswrapper[4774]: I1003 15:02:12.993638 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.363393062 podStartE2EDuration="5.99362061s" podCreationTimestamp="2025-10-03 15:02:07 +0000 UTC" firstStartedPulling="2025-10-03 15:02:08.033543701 +0000 UTC m=+1150.622747153" lastFinishedPulling="2025-10-03 15:02:11.663771249 +0000 UTC m=+1154.252974701" observedRunningTime="2025-10-03 15:02:12.989725844 +0000 UTC m=+1155.578929296" watchObservedRunningTime="2025-10-03 15:02:12.99362061 +0000 UTC m=+1155.582824062" Oct 03 15:02:13 crc kubenswrapper[4774]: I1003 15:02:13.185905 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 15:02:13 crc kubenswrapper[4774]: I1003 15:02:13.185960 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 15:02:13 crc kubenswrapper[4774]: I1003 15:02:13.216603 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 15:02:13 crc kubenswrapper[4774]: I1003 15:02:13.248762 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 03 15:02:13 crc kubenswrapper[4774]: I1003 15:02:13.248859 4774 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 15:02:13 crc kubenswrapper[4774]: I1003 15:02:13.254547 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 15:02:13 crc kubenswrapper[4774]: I1003 15:02:13.379736 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 03 15:02:13 crc kubenswrapper[4774]: I1003 15:02:13.975712 4774 generic.go:334] "Generic (PLEG): container finished" podID="82dca456-3f74-4976-a252-27f68a189944" containerID="e89ffcd907f223753cf30bfd035bb731070f165179efe27f1489a10ee205ff48" exitCode=0 Oct 03 15:02:13 crc kubenswrapper[4774]: I1003 15:02:13.976053 4774 generic.go:334] "Generic (PLEG): container finished" podID="82dca456-3f74-4976-a252-27f68a189944" containerID="f8e41ac24f1f00f6478f75e26305ea4a0e9c3ba8f26901db674b2fddf85127bd" exitCode=2 Oct 03 15:02:13 crc kubenswrapper[4774]: I1003 15:02:13.976067 4774 generic.go:334] "Generic (PLEG): container finished" podID="82dca456-3f74-4976-a252-27f68a189944" containerID="60c99f4078afa883ce346bae835c73380aec43d7892ce5cc2c504d2aa6836694" exitCode=0 Oct 03 15:02:13 crc kubenswrapper[4774]: I1003 15:02:13.975791 4774 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"82dca456-3f74-4976-a252-27f68a189944","Type":"ContainerDied","Data":"e89ffcd907f223753cf30bfd035bb731070f165179efe27f1489a10ee205ff48"} Oct 03 15:02:13 crc kubenswrapper[4774]: I1003 15:02:13.976739 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 15:02:13 crc kubenswrapper[4774]: I1003 15:02:13.976757 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82dca456-3f74-4976-a252-27f68a189944","Type":"ContainerDied","Data":"f8e41ac24f1f00f6478f75e26305ea4a0e9c3ba8f26901db674b2fddf85127bd"} Oct 03 15:02:13 crc kubenswrapper[4774]: I1003 15:02:13.976773 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82dca456-3f74-4976-a252-27f68a189944","Type":"ContainerDied","Data":"60c99f4078afa883ce346bae835c73380aec43d7892ce5cc2c504d2aa6836694"} Oct 03 15:02:13 crc kubenswrapper[4774]: I1003 15:02:13.976932 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.579025 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.749956 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82dca456-3f74-4976-a252-27f68a189944-config-data\") pod \"82dca456-3f74-4976-a252-27f68a189944\" (UID: \"82dca456-3f74-4976-a252-27f68a189944\") " Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.750035 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82dca456-3f74-4976-a252-27f68a189944-run-httpd\") pod \"82dca456-3f74-4976-a252-27f68a189944\" (UID: \"82dca456-3f74-4976-a252-27f68a189944\") " Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.750179 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82dca456-3f74-4976-a252-27f68a189944-scripts\") pod \"82dca456-3f74-4976-a252-27f68a189944\" (UID: \"82dca456-3f74-4976-a252-27f68a189944\") " Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.750214 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82dca456-3f74-4976-a252-27f68a189944-sg-core-conf-yaml\") pod \"82dca456-3f74-4976-a252-27f68a189944\" (UID: \"82dca456-3f74-4976-a252-27f68a189944\") " Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.750238 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gsz5\" (UniqueName: \"kubernetes.io/projected/82dca456-3f74-4976-a252-27f68a189944-kube-api-access-8gsz5\") pod \"82dca456-3f74-4976-a252-27f68a189944\" (UID: \"82dca456-3f74-4976-a252-27f68a189944\") " Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.750329 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/82dca456-3f74-4976-a252-27f68a189944-log-httpd\") pod \"82dca456-3f74-4976-a252-27f68a189944\" (UID: \"82dca456-3f74-4976-a252-27f68a189944\") " Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.750355 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82dca456-3f74-4976-a252-27f68a189944-combined-ca-bundle\") pod \"82dca456-3f74-4976-a252-27f68a189944\" (UID: \"82dca456-3f74-4976-a252-27f68a189944\") " Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.750411 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82dca456-3f74-4976-a252-27f68a189944-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "82dca456-3f74-4976-a252-27f68a189944" (UID: "82dca456-3f74-4976-a252-27f68a189944"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.750722 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82dca456-3f74-4976-a252-27f68a189944-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "82dca456-3f74-4976-a252-27f68a189944" (UID: "82dca456-3f74-4976-a252-27f68a189944"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.751164 4774 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82dca456-3f74-4976-a252-27f68a189944-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.751192 4774 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82dca456-3f74-4976-a252-27f68a189944-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.760570 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82dca456-3f74-4976-a252-27f68a189944-kube-api-access-8gsz5" (OuterVolumeSpecName: "kube-api-access-8gsz5") pod "82dca456-3f74-4976-a252-27f68a189944" (UID: "82dca456-3f74-4976-a252-27f68a189944"). InnerVolumeSpecName "kube-api-access-8gsz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.767536 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82dca456-3f74-4976-a252-27f68a189944-scripts" (OuterVolumeSpecName: "scripts") pod "82dca456-3f74-4976-a252-27f68a189944" (UID: "82dca456-3f74-4976-a252-27f68a189944"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.786506 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82dca456-3f74-4976-a252-27f68a189944-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "82dca456-3f74-4976-a252-27f68a189944" (UID: "82dca456-3f74-4976-a252-27f68a189944"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.848507 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82dca456-3f74-4976-a252-27f68a189944-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82dca456-3f74-4976-a252-27f68a189944" (UID: "82dca456-3f74-4976-a252-27f68a189944"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.853351 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82dca456-3f74-4976-a252-27f68a189944-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.853403 4774 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82dca456-3f74-4976-a252-27f68a189944-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.853417 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gsz5\" (UniqueName: \"kubernetes.io/projected/82dca456-3f74-4976-a252-27f68a189944-kube-api-access-8gsz5\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.853427 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82dca456-3f74-4976-a252-27f68a189944-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.873427 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82dca456-3f74-4976-a252-27f68a189944-config-data" (OuterVolumeSpecName: "config-data") pod "82dca456-3f74-4976-a252-27f68a189944" (UID: "82dca456-3f74-4976-a252-27f68a189944"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.904567 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wpkth"] Oct 03 15:02:14 crc kubenswrapper[4774]: E1003 15:02:14.904947 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55fbe210-dfe5-4488-9461-8b1f67d30f49" containerName="mariadb-account-create" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.904964 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="55fbe210-dfe5-4488-9461-8b1f67d30f49" containerName="mariadb-account-create" Oct 03 15:02:14 crc kubenswrapper[4774]: E1003 15:02:14.904986 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc84e6e6-c0c7-44f6-8834-0e9f5e52f98c" containerName="mariadb-account-create" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.904992 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc84e6e6-c0c7-44f6-8834-0e9f5e52f98c" containerName="mariadb-account-create" Oct 03 15:02:14 crc kubenswrapper[4774]: E1003 15:02:14.905003 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82dca456-3f74-4976-a252-27f68a189944" containerName="ceilometer-notification-agent" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.905009 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="82dca456-3f74-4976-a252-27f68a189944" containerName="ceilometer-notification-agent" Oct 03 15:02:14 crc kubenswrapper[4774]: E1003 15:02:14.905021 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82dca456-3f74-4976-a252-27f68a189944" containerName="sg-core" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.905026 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="82dca456-3f74-4976-a252-27f68a189944" containerName="sg-core" Oct 03 15:02:14 crc kubenswrapper[4774]: E1003 15:02:14.905091 4774 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="82dca456-3f74-4976-a252-27f68a189944" containerName="ceilometer-central-agent" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.905100 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="82dca456-3f74-4976-a252-27f68a189944" containerName="ceilometer-central-agent" Oct 03 15:02:14 crc kubenswrapper[4774]: E1003 15:02:14.905114 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82dca456-3f74-4976-a252-27f68a189944" containerName="proxy-httpd" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.905119 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="82dca456-3f74-4976-a252-27f68a189944" containerName="proxy-httpd" Oct 03 15:02:14 crc kubenswrapper[4774]: E1003 15:02:14.905128 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1965846-33db-41a7-b097-d26e5a398986" containerName="mariadb-account-create" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.905133 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1965846-33db-41a7-b097-d26e5a398986" containerName="mariadb-account-create" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.905281 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="82dca456-3f74-4976-a252-27f68a189944" containerName="proxy-httpd" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.905296 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="82dca456-3f74-4976-a252-27f68a189944" containerName="ceilometer-notification-agent" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.905308 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="55fbe210-dfe5-4488-9461-8b1f67d30f49" containerName="mariadb-account-create" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.905319 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="82dca456-3f74-4976-a252-27f68a189944" containerName="ceilometer-central-agent" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.905329 4774 
memory_manager.go:354] "RemoveStaleState removing state" podUID="82dca456-3f74-4976-a252-27f68a189944" containerName="sg-core" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.905335 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1965846-33db-41a7-b097-d26e5a398986" containerName="mariadb-account-create" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.905348 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc84e6e6-c0c7-44f6-8834-0e9f5e52f98c" containerName="mariadb-account-create" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.905928 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wpkth" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.913680 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-29swj" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.913942 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.920728 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.923775 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wpkth"] Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.955346 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82dca456-3f74-4976-a252-27f68a189944-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.996271 4774 generic.go:334] "Generic (PLEG): container finished" podID="82dca456-3f74-4976-a252-27f68a189944" containerID="d47b7a33d34e46154ecd5a37b959bca07fcc8346b172b26189a6c5515a503cef" exitCode=0 Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 
15:02:14.996639 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.996666 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82dca456-3f74-4976-a252-27f68a189944","Type":"ContainerDied","Data":"d47b7a33d34e46154ecd5a37b959bca07fcc8346b172b26189a6c5515a503cef"} Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.996719 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82dca456-3f74-4976-a252-27f68a189944","Type":"ContainerDied","Data":"c4667b62392f49fd286fc326239862db0ded73b7efc60281e791f6bf53b33f02"} Oct 03 15:02:14 crc kubenswrapper[4774]: I1003 15:02:14.996741 4774 scope.go:117] "RemoveContainer" containerID="e89ffcd907f223753cf30bfd035bb731070f165179efe27f1489a10ee205ff48" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.022057 4774 scope.go:117] "RemoveContainer" containerID="f8e41ac24f1f00f6478f75e26305ea4a0e9c3ba8f26901db674b2fddf85127bd" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.048637 4774 scope.go:117] "RemoveContainer" containerID="60c99f4078afa883ce346bae835c73380aec43d7892ce5cc2c504d2aa6836694" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.048781 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.052702 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.057393 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c24e835e-fdf1-44ec-ad96-3b54ab88253e-scripts\") pod \"nova-cell0-conductor-db-sync-wpkth\" (UID: \"c24e835e-fdf1-44ec-ad96-3b54ab88253e\") " pod="openstack/nova-cell0-conductor-db-sync-wpkth" Oct 03 15:02:15 crc kubenswrapper[4774]: 
I1003 15:02:15.057453 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdmk5\" (UniqueName: \"kubernetes.io/projected/c24e835e-fdf1-44ec-ad96-3b54ab88253e-kube-api-access-cdmk5\") pod \"nova-cell0-conductor-db-sync-wpkth\" (UID: \"c24e835e-fdf1-44ec-ad96-3b54ab88253e\") " pod="openstack/nova-cell0-conductor-db-sync-wpkth" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.057494 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c24e835e-fdf1-44ec-ad96-3b54ab88253e-config-data\") pod \"nova-cell0-conductor-db-sync-wpkth\" (UID: \"c24e835e-fdf1-44ec-ad96-3b54ab88253e\") " pod="openstack/nova-cell0-conductor-db-sync-wpkth" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.057597 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c24e835e-fdf1-44ec-ad96-3b54ab88253e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wpkth\" (UID: \"c24e835e-fdf1-44ec-ad96-3b54ab88253e\") " pod="openstack/nova-cell0-conductor-db-sync-wpkth" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.061447 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.064044 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.066734 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.066958 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.074977 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.075326 4774 scope.go:117] "RemoveContainer" containerID="d47b7a33d34e46154ecd5a37b959bca07fcc8346b172b26189a6c5515a503cef" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.154667 4774 scope.go:117] "RemoveContainer" containerID="e89ffcd907f223753cf30bfd035bb731070f165179efe27f1489a10ee205ff48" Oct 03 15:02:15 crc kubenswrapper[4774]: E1003 15:02:15.155979 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e89ffcd907f223753cf30bfd035bb731070f165179efe27f1489a10ee205ff48\": container with ID starting with e89ffcd907f223753cf30bfd035bb731070f165179efe27f1489a10ee205ff48 not found: ID does not exist" containerID="e89ffcd907f223753cf30bfd035bb731070f165179efe27f1489a10ee205ff48" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.156019 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e89ffcd907f223753cf30bfd035bb731070f165179efe27f1489a10ee205ff48"} err="failed to get container status \"e89ffcd907f223753cf30bfd035bb731070f165179efe27f1489a10ee205ff48\": rpc error: code = NotFound desc = could not find container \"e89ffcd907f223753cf30bfd035bb731070f165179efe27f1489a10ee205ff48\": container with ID starting with e89ffcd907f223753cf30bfd035bb731070f165179efe27f1489a10ee205ff48 not found: ID does not exist" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 
15:02:15.156046 4774 scope.go:117] "RemoveContainer" containerID="f8e41ac24f1f00f6478f75e26305ea4a0e9c3ba8f26901db674b2fddf85127bd" Oct 03 15:02:15 crc kubenswrapper[4774]: E1003 15:02:15.156499 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8e41ac24f1f00f6478f75e26305ea4a0e9c3ba8f26901db674b2fddf85127bd\": container with ID starting with f8e41ac24f1f00f6478f75e26305ea4a0e9c3ba8f26901db674b2fddf85127bd not found: ID does not exist" containerID="f8e41ac24f1f00f6478f75e26305ea4a0e9c3ba8f26901db674b2fddf85127bd" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.156535 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8e41ac24f1f00f6478f75e26305ea4a0e9c3ba8f26901db674b2fddf85127bd"} err="failed to get container status \"f8e41ac24f1f00f6478f75e26305ea4a0e9c3ba8f26901db674b2fddf85127bd\": rpc error: code = NotFound desc = could not find container \"f8e41ac24f1f00f6478f75e26305ea4a0e9c3ba8f26901db674b2fddf85127bd\": container with ID starting with f8e41ac24f1f00f6478f75e26305ea4a0e9c3ba8f26901db674b2fddf85127bd not found: ID does not exist" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.156554 4774 scope.go:117] "RemoveContainer" containerID="60c99f4078afa883ce346bae835c73380aec43d7892ce5cc2c504d2aa6836694" Oct 03 15:02:15 crc kubenswrapper[4774]: E1003 15:02:15.156894 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60c99f4078afa883ce346bae835c73380aec43d7892ce5cc2c504d2aa6836694\": container with ID starting with 60c99f4078afa883ce346bae835c73380aec43d7892ce5cc2c504d2aa6836694 not found: ID does not exist" containerID="60c99f4078afa883ce346bae835c73380aec43d7892ce5cc2c504d2aa6836694" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.156920 4774 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"60c99f4078afa883ce346bae835c73380aec43d7892ce5cc2c504d2aa6836694"} err="failed to get container status \"60c99f4078afa883ce346bae835c73380aec43d7892ce5cc2c504d2aa6836694\": rpc error: code = NotFound desc = could not find container \"60c99f4078afa883ce346bae835c73380aec43d7892ce5cc2c504d2aa6836694\": container with ID starting with 60c99f4078afa883ce346bae835c73380aec43d7892ce5cc2c504d2aa6836694 not found: ID does not exist" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.156940 4774 scope.go:117] "RemoveContainer" containerID="d47b7a33d34e46154ecd5a37b959bca07fcc8346b172b26189a6c5515a503cef" Oct 03 15:02:15 crc kubenswrapper[4774]: E1003 15:02:15.157163 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d47b7a33d34e46154ecd5a37b959bca07fcc8346b172b26189a6c5515a503cef\": container with ID starting with d47b7a33d34e46154ecd5a37b959bca07fcc8346b172b26189a6c5515a503cef not found: ID does not exist" containerID="d47b7a33d34e46154ecd5a37b959bca07fcc8346b172b26189a6c5515a503cef" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.157185 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d47b7a33d34e46154ecd5a37b959bca07fcc8346b172b26189a6c5515a503cef"} err="failed to get container status \"d47b7a33d34e46154ecd5a37b959bca07fcc8346b172b26189a6c5515a503cef\": rpc error: code = NotFound desc = could not find container \"d47b7a33d34e46154ecd5a37b959bca07fcc8346b172b26189a6c5515a503cef\": container with ID starting with d47b7a33d34e46154ecd5a37b959bca07fcc8346b172b26189a6c5515a503cef not found: ID does not exist" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.159313 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-scripts\") pod \"ceilometer-0\" (UID: 
\"b5d11ef5-a971-4de4-b14c-94dda7a14bbd\") " pod="openstack/ceilometer-0" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.159353 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-config-data\") pod \"ceilometer-0\" (UID: \"b5d11ef5-a971-4de4-b14c-94dda7a14bbd\") " pod="openstack/ceilometer-0" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.159435 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhl6w\" (UniqueName: \"kubernetes.io/projected/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-kube-api-access-jhl6w\") pod \"ceilometer-0\" (UID: \"b5d11ef5-a971-4de4-b14c-94dda7a14bbd\") " pod="openstack/ceilometer-0" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.159486 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b5d11ef5-a971-4de4-b14c-94dda7a14bbd\") " pod="openstack/ceilometer-0" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.159512 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c24e835e-fdf1-44ec-ad96-3b54ab88253e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wpkth\" (UID: \"c24e835e-fdf1-44ec-ad96-3b54ab88253e\") " pod="openstack/nova-cell0-conductor-db-sync-wpkth" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.159544 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b5d11ef5-a971-4de4-b14c-94dda7a14bbd\") " pod="openstack/ceilometer-0" Oct 
03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.159614 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-log-httpd\") pod \"ceilometer-0\" (UID: \"b5d11ef5-a971-4de4-b14c-94dda7a14bbd\") " pod="openstack/ceilometer-0" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.159648 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c24e835e-fdf1-44ec-ad96-3b54ab88253e-scripts\") pod \"nova-cell0-conductor-db-sync-wpkth\" (UID: \"c24e835e-fdf1-44ec-ad96-3b54ab88253e\") " pod="openstack/nova-cell0-conductor-db-sync-wpkth" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.159684 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-run-httpd\") pod \"ceilometer-0\" (UID: \"b5d11ef5-a971-4de4-b14c-94dda7a14bbd\") " pod="openstack/ceilometer-0" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.159750 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdmk5\" (UniqueName: \"kubernetes.io/projected/c24e835e-fdf1-44ec-ad96-3b54ab88253e-kube-api-access-cdmk5\") pod \"nova-cell0-conductor-db-sync-wpkth\" (UID: \"c24e835e-fdf1-44ec-ad96-3b54ab88253e\") " pod="openstack/nova-cell0-conductor-db-sync-wpkth" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.159779 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c24e835e-fdf1-44ec-ad96-3b54ab88253e-config-data\") pod \"nova-cell0-conductor-db-sync-wpkth\" (UID: \"c24e835e-fdf1-44ec-ad96-3b54ab88253e\") " pod="openstack/nova-cell0-conductor-db-sync-wpkth" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.164318 4774 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c24e835e-fdf1-44ec-ad96-3b54ab88253e-config-data\") pod \"nova-cell0-conductor-db-sync-wpkth\" (UID: \"c24e835e-fdf1-44ec-ad96-3b54ab88253e\") " pod="openstack/nova-cell0-conductor-db-sync-wpkth" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.164741 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c24e835e-fdf1-44ec-ad96-3b54ab88253e-scripts\") pod \"nova-cell0-conductor-db-sync-wpkth\" (UID: \"c24e835e-fdf1-44ec-ad96-3b54ab88253e\") " pod="openstack/nova-cell0-conductor-db-sync-wpkth" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.165798 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c24e835e-fdf1-44ec-ad96-3b54ab88253e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wpkth\" (UID: \"c24e835e-fdf1-44ec-ad96-3b54ab88253e\") " pod="openstack/nova-cell0-conductor-db-sync-wpkth" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.186208 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdmk5\" (UniqueName: \"kubernetes.io/projected/c24e835e-fdf1-44ec-ad96-3b54ab88253e-kube-api-access-cdmk5\") pod \"nova-cell0-conductor-db-sync-wpkth\" (UID: \"c24e835e-fdf1-44ec-ad96-3b54ab88253e\") " pod="openstack/nova-cell0-conductor-db-sync-wpkth" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.261501 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-scripts\") pod \"ceilometer-0\" (UID: \"b5d11ef5-a971-4de4-b14c-94dda7a14bbd\") " pod="openstack/ceilometer-0" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.261569 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-config-data\") pod \"ceilometer-0\" (UID: \"b5d11ef5-a971-4de4-b14c-94dda7a14bbd\") " pod="openstack/ceilometer-0" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.261670 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhl6w\" (UniqueName: \"kubernetes.io/projected/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-kube-api-access-jhl6w\") pod \"ceilometer-0\" (UID: \"b5d11ef5-a971-4de4-b14c-94dda7a14bbd\") " pod="openstack/ceilometer-0" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.262355 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b5d11ef5-a971-4de4-b14c-94dda7a14bbd\") " pod="openstack/ceilometer-0" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.262479 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b5d11ef5-a971-4de4-b14c-94dda7a14bbd\") " pod="openstack/ceilometer-0" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.262571 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-log-httpd\") pod \"ceilometer-0\" (UID: \"b5d11ef5-a971-4de4-b14c-94dda7a14bbd\") " pod="openstack/ceilometer-0" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.262982 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-run-httpd\") pod \"ceilometer-0\" (UID: \"b5d11ef5-a971-4de4-b14c-94dda7a14bbd\") " pod="openstack/ceilometer-0" Oct 03 15:02:15 crc 
kubenswrapper[4774]: I1003 15:02:15.263163 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-log-httpd\") pod \"ceilometer-0\" (UID: \"b5d11ef5-a971-4de4-b14c-94dda7a14bbd\") " pod="openstack/ceilometer-0" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.263393 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-run-httpd\") pod \"ceilometer-0\" (UID: \"b5d11ef5-a971-4de4-b14c-94dda7a14bbd\") " pod="openstack/ceilometer-0" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.265633 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-scripts\") pod \"ceilometer-0\" (UID: \"b5d11ef5-a971-4de4-b14c-94dda7a14bbd\") " pod="openstack/ceilometer-0" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.266249 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-config-data\") pod \"ceilometer-0\" (UID: \"b5d11ef5-a971-4de4-b14c-94dda7a14bbd\") " pod="openstack/ceilometer-0" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.267289 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b5d11ef5-a971-4de4-b14c-94dda7a14bbd\") " pod="openstack/ceilometer-0" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.267947 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wpkth" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.268483 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b5d11ef5-a971-4de4-b14c-94dda7a14bbd\") " pod="openstack/ceilometer-0" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.284827 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhl6w\" (UniqueName: \"kubernetes.io/projected/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-kube-api-access-jhl6w\") pod \"ceilometer-0\" (UID: \"b5d11ef5-a971-4de4-b14c-94dda7a14bbd\") " pod="openstack/ceilometer-0" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.321098 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82dca456-3f74-4976-a252-27f68a189944" path="/var/lib/kubelet/pods/82dca456-3f74-4976-a252-27f68a189944/volumes" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.433494 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.675805 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-77bb8d5544-lc44r" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.771806 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c-combined-ca-bundle\") pod \"6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c\" (UID: \"6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c\") " Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.771866 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c-config\") pod \"6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c\" (UID: \"6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c\") " Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.771907 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c-httpd-config\") pod \"6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c\" (UID: \"6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c\") " Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.772016 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c-ovndb-tls-certs\") pod \"6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c\" (UID: \"6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c\") " Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.772080 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dc2jk\" (UniqueName: \"kubernetes.io/projected/6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c-kube-api-access-dc2jk\") pod \"6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c\" (UID: \"6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c\") " Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.778245 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c-kube-api-access-dc2jk" (OuterVolumeSpecName: "kube-api-access-dc2jk") pod "6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c" (UID: "6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c"). InnerVolumeSpecName "kube-api-access-dc2jk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.779517 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c" (UID: "6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.845468 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c-config" (OuterVolumeSpecName: "config") pod "6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c" (UID: "6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.864239 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c" (UID: "6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.874568 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.874606 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c-config\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.874620 4774 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.874634 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dc2jk\" (UniqueName: \"kubernetes.io/projected/6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c-kube-api-access-dc2jk\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.881423 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c" (UID: "6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:02:15 crc kubenswrapper[4774]: W1003 15:02:15.898811 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc24e835e_fdf1_44ec_ad96_3b54ab88253e.slice/crio-e915ed3d43111e79c24e4f10571502de64dd7cce2b2c76e7a78abf5beff480c7 WatchSource:0}: Error finding container e915ed3d43111e79c24e4f10571502de64dd7cce2b2c76e7a78abf5beff480c7: Status 404 returned error can't find the container with id e915ed3d43111e79c24e4f10571502de64dd7cce2b2c76e7a78abf5beff480c7 Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.903307 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wpkth"] Oct 03 15:02:15 crc kubenswrapper[4774]: I1003 15:02:15.976304 4774 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:16 crc kubenswrapper[4774]: I1003 15:02:16.006895 4774 generic.go:334] "Generic (PLEG): container finished" podID="6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c" containerID="6abece3013abb91833b7834b0416ec0d019a4edbf03c419fb55d137beffaa70a" exitCode=0 Oct 03 15:02:16 crc kubenswrapper[4774]: I1003 15:02:16.006956 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77bb8d5544-lc44r" event={"ID":"6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c","Type":"ContainerDied","Data":"6abece3013abb91833b7834b0416ec0d019a4edbf03c419fb55d137beffaa70a"} Oct 03 15:02:16 crc kubenswrapper[4774]: I1003 15:02:16.006987 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77bb8d5544-lc44r" event={"ID":"6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c","Type":"ContainerDied","Data":"cd6cfbc3bcde1deb6f275865320f661a41ba64b3f82471351552d04b5a74a4c5"} Oct 03 15:02:16 crc kubenswrapper[4774]: I1003 15:02:16.006994 4774 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-77bb8d5544-lc44r" Oct 03 15:02:16 crc kubenswrapper[4774]: I1003 15:02:16.007005 4774 scope.go:117] "RemoveContainer" containerID="4fbe9fa375c2de6d09e0551a0d46765f1b6fc379dc2b0650a0d993bd9a9f8d9f" Oct 03 15:02:16 crc kubenswrapper[4774]: I1003 15:02:16.011348 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wpkth" event={"ID":"c24e835e-fdf1-44ec-ad96-3b54ab88253e","Type":"ContainerStarted","Data":"e915ed3d43111e79c24e4f10571502de64dd7cce2b2c76e7a78abf5beff480c7"} Oct 03 15:02:16 crc kubenswrapper[4774]: I1003 15:02:16.028572 4774 scope.go:117] "RemoveContainer" containerID="6abece3013abb91833b7834b0416ec0d019a4edbf03c419fb55d137beffaa70a" Oct 03 15:02:16 crc kubenswrapper[4774]: I1003 15:02:16.036769 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:02:16 crc kubenswrapper[4774]: I1003 15:02:16.054511 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-77bb8d5544-lc44r"] Oct 03 15:02:16 crc kubenswrapper[4774]: I1003 15:02:16.096648 4774 scope.go:117] "RemoveContainer" containerID="4fbe9fa375c2de6d09e0551a0d46765f1b6fc379dc2b0650a0d993bd9a9f8d9f" Oct 03 15:02:16 crc kubenswrapper[4774]: E1003 15:02:16.097195 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fbe9fa375c2de6d09e0551a0d46765f1b6fc379dc2b0650a0d993bd9a9f8d9f\": container with ID starting with 4fbe9fa375c2de6d09e0551a0d46765f1b6fc379dc2b0650a0d993bd9a9f8d9f not found: ID does not exist" containerID="4fbe9fa375c2de6d09e0551a0d46765f1b6fc379dc2b0650a0d993bd9a9f8d9f" Oct 03 15:02:16 crc kubenswrapper[4774]: I1003 15:02:16.097235 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fbe9fa375c2de6d09e0551a0d46765f1b6fc379dc2b0650a0d993bd9a9f8d9f"} err="failed to get container status 
\"4fbe9fa375c2de6d09e0551a0d46765f1b6fc379dc2b0650a0d993bd9a9f8d9f\": rpc error: code = NotFound desc = could not find container \"4fbe9fa375c2de6d09e0551a0d46765f1b6fc379dc2b0650a0d993bd9a9f8d9f\": container with ID starting with 4fbe9fa375c2de6d09e0551a0d46765f1b6fc379dc2b0650a0d993bd9a9f8d9f not found: ID does not exist" Oct 03 15:02:16 crc kubenswrapper[4774]: I1003 15:02:16.097263 4774 scope.go:117] "RemoveContainer" containerID="6abece3013abb91833b7834b0416ec0d019a4edbf03c419fb55d137beffaa70a" Oct 03 15:02:16 crc kubenswrapper[4774]: E1003 15:02:16.097652 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6abece3013abb91833b7834b0416ec0d019a4edbf03c419fb55d137beffaa70a\": container with ID starting with 6abece3013abb91833b7834b0416ec0d019a4edbf03c419fb55d137beffaa70a not found: ID does not exist" containerID="6abece3013abb91833b7834b0416ec0d019a4edbf03c419fb55d137beffaa70a" Oct 03 15:02:16 crc kubenswrapper[4774]: I1003 15:02:16.097685 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6abece3013abb91833b7834b0416ec0d019a4edbf03c419fb55d137beffaa70a"} err="failed to get container status \"6abece3013abb91833b7834b0416ec0d019a4edbf03c419fb55d137beffaa70a\": rpc error: code = NotFound desc = could not find container \"6abece3013abb91833b7834b0416ec0d019a4edbf03c419fb55d137beffaa70a\": container with ID starting with 6abece3013abb91833b7834b0416ec0d019a4edbf03c419fb55d137beffaa70a not found: ID does not exist" Oct 03 15:02:16 crc kubenswrapper[4774]: I1003 15:02:16.121243 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-77bb8d5544-lc44r"] Oct 03 15:02:16 crc kubenswrapper[4774]: I1003 15:02:16.499722 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 15:02:16 crc kubenswrapper[4774]: I1003 15:02:16.500018 4774 prober_manager.go:312] 
"Failed to trigger a manual run" probe="Readiness" Oct 03 15:02:16 crc kubenswrapper[4774]: I1003 15:02:16.829805 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:02:16 crc kubenswrapper[4774]: I1003 15:02:16.991253 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 15:02:17 crc kubenswrapper[4774]: I1003 15:02:17.038063 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b5d11ef5-a971-4de4-b14c-94dda7a14bbd","Type":"ContainerStarted","Data":"c2d9303de5a6c755d35f45b773e52468c1a6084003abd1875b67115bd32c86b6"} Oct 03 15:02:17 crc kubenswrapper[4774]: I1003 15:02:17.038926 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b5d11ef5-a971-4de4-b14c-94dda7a14bbd","Type":"ContainerStarted","Data":"f135db6fadef5bd5f26ff4a410163c852294d92e179c2aa3932188111e44fb71"} Oct 03 15:02:17 crc kubenswrapper[4774]: I1003 15:02:17.312574 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c" path="/var/lib/kubelet/pods/6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c/volumes" Oct 03 15:02:18 crc kubenswrapper[4774]: I1003 15:02:18.058622 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b5d11ef5-a971-4de4-b14c-94dda7a14bbd","Type":"ContainerStarted","Data":"b20dfe7376a44bc9f3b32d0f25ab182c8bcf66afb501850833b1ac93015022b0"} Oct 03 15:02:19 crc kubenswrapper[4774]: I1003 15:02:19.071899 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b5d11ef5-a971-4de4-b14c-94dda7a14bbd","Type":"ContainerStarted","Data":"05d6297fd6453621fa2033d582686766b199331526522a2998690d57e20dcbbc"} Oct 03 15:02:20 crc kubenswrapper[4774]: I1003 15:02:20.653800 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:02:20 crc kubenswrapper[4774]: I1003 15:02:20.654388 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:02:25 crc kubenswrapper[4774]: I1003 15:02:25.131713 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wpkth" event={"ID":"c24e835e-fdf1-44ec-ad96-3b54ab88253e","Type":"ContainerStarted","Data":"125da0755aa46a51b1d2761dd1db9566f71c631bfdd9f55ac18319b30da33858"} Oct 03 15:02:25 crc kubenswrapper[4774]: I1003 15:02:25.134497 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b5d11ef5-a971-4de4-b14c-94dda7a14bbd","Type":"ContainerStarted","Data":"89632d8b468b690f38fc770bac5579fc7354d4bb294d39dc9b642d4b28f5c31b"} Oct 03 15:02:25 crc kubenswrapper[4774]: I1003 15:02:25.134676 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b5d11ef5-a971-4de4-b14c-94dda7a14bbd" containerName="sg-core" containerID="cri-o://05d6297fd6453621fa2033d582686766b199331526522a2998690d57e20dcbbc" gracePeriod=30 Oct 03 15:02:25 crc kubenswrapper[4774]: I1003 15:02:25.134728 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b5d11ef5-a971-4de4-b14c-94dda7a14bbd" containerName="proxy-httpd" containerID="cri-o://89632d8b468b690f38fc770bac5579fc7354d4bb294d39dc9b642d4b28f5c31b" gracePeriod=30 Oct 03 15:02:25 crc kubenswrapper[4774]: I1003 15:02:25.134648 4774 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="b5d11ef5-a971-4de4-b14c-94dda7a14bbd" containerName="ceilometer-central-agent" containerID="cri-o://c2d9303de5a6c755d35f45b773e52468c1a6084003abd1875b67115bd32c86b6" gracePeriod=30 Oct 03 15:02:25 crc kubenswrapper[4774]: I1003 15:02:25.134690 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 15:02:25 crc kubenswrapper[4774]: I1003 15:02:25.134849 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b5d11ef5-a971-4de4-b14c-94dda7a14bbd" containerName="ceilometer-notification-agent" containerID="cri-o://b20dfe7376a44bc9f3b32d0f25ab182c8bcf66afb501850833b1ac93015022b0" gracePeriod=30 Oct 03 15:02:25 crc kubenswrapper[4774]: I1003 15:02:25.151524 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-wpkth" podStartSLOduration=3.228044523 podStartE2EDuration="11.151504412s" podCreationTimestamp="2025-10-03 15:02:14 +0000 UTC" firstStartedPulling="2025-10-03 15:02:15.903261277 +0000 UTC m=+1158.492464729" lastFinishedPulling="2025-10-03 15:02:23.826721146 +0000 UTC m=+1166.415924618" observedRunningTime="2025-10-03 15:02:25.14942728 +0000 UTC m=+1167.738630732" watchObservedRunningTime="2025-10-03 15:02:25.151504412 +0000 UTC m=+1167.740707864" Oct 03 15:02:25 crc kubenswrapper[4774]: I1003 15:02:25.175964 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.446701502 podStartE2EDuration="10.175942397s" podCreationTimestamp="2025-10-03 15:02:15 +0000 UTC" firstStartedPulling="2025-10-03 15:02:16.097597904 +0000 UTC m=+1158.686801356" lastFinishedPulling="2025-10-03 15:02:23.826838789 +0000 UTC m=+1166.416042251" observedRunningTime="2025-10-03 15:02:25.168188265 +0000 UTC m=+1167.757391707" watchObservedRunningTime="2025-10-03 15:02:25.175942397 +0000 UTC 
m=+1167.765145849" Oct 03 15:02:26 crc kubenswrapper[4774]: I1003 15:02:26.143554 4774 generic.go:334] "Generic (PLEG): container finished" podID="b5d11ef5-a971-4de4-b14c-94dda7a14bbd" containerID="89632d8b468b690f38fc770bac5579fc7354d4bb294d39dc9b642d4b28f5c31b" exitCode=0 Oct 03 15:02:26 crc kubenswrapper[4774]: I1003 15:02:26.143589 4774 generic.go:334] "Generic (PLEG): container finished" podID="b5d11ef5-a971-4de4-b14c-94dda7a14bbd" containerID="05d6297fd6453621fa2033d582686766b199331526522a2998690d57e20dcbbc" exitCode=2 Oct 03 15:02:26 crc kubenswrapper[4774]: I1003 15:02:26.143601 4774 generic.go:334] "Generic (PLEG): container finished" podID="b5d11ef5-a971-4de4-b14c-94dda7a14bbd" containerID="b20dfe7376a44bc9f3b32d0f25ab182c8bcf66afb501850833b1ac93015022b0" exitCode=0 Oct 03 15:02:26 crc kubenswrapper[4774]: I1003 15:02:26.143608 4774 generic.go:334] "Generic (PLEG): container finished" podID="b5d11ef5-a971-4de4-b14c-94dda7a14bbd" containerID="c2d9303de5a6c755d35f45b773e52468c1a6084003abd1875b67115bd32c86b6" exitCode=0 Oct 03 15:02:26 crc kubenswrapper[4774]: I1003 15:02:26.143665 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b5d11ef5-a971-4de4-b14c-94dda7a14bbd","Type":"ContainerDied","Data":"89632d8b468b690f38fc770bac5579fc7354d4bb294d39dc9b642d4b28f5c31b"} Oct 03 15:02:26 crc kubenswrapper[4774]: I1003 15:02:26.143743 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b5d11ef5-a971-4de4-b14c-94dda7a14bbd","Type":"ContainerDied","Data":"05d6297fd6453621fa2033d582686766b199331526522a2998690d57e20dcbbc"} Oct 03 15:02:26 crc kubenswrapper[4774]: I1003 15:02:26.143768 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b5d11ef5-a971-4de4-b14c-94dda7a14bbd","Type":"ContainerDied","Data":"b20dfe7376a44bc9f3b32d0f25ab182c8bcf66afb501850833b1ac93015022b0"} Oct 03 15:02:26 crc kubenswrapper[4774]: I1003 15:02:26.143785 4774 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b5d11ef5-a971-4de4-b14c-94dda7a14bbd","Type":"ContainerDied","Data":"c2d9303de5a6c755d35f45b773e52468c1a6084003abd1875b67115bd32c86b6"} Oct 03 15:02:26 crc kubenswrapper[4774]: I1003 15:02:26.638789 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 15:02:26 crc kubenswrapper[4774]: I1003 15:02:26.708138 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-combined-ca-bundle\") pod \"b5d11ef5-a971-4de4-b14c-94dda7a14bbd\" (UID: \"b5d11ef5-a971-4de4-b14c-94dda7a14bbd\") " Oct 03 15:02:26 crc kubenswrapper[4774]: I1003 15:02:26.708229 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-config-data\") pod \"b5d11ef5-a971-4de4-b14c-94dda7a14bbd\" (UID: \"b5d11ef5-a971-4de4-b14c-94dda7a14bbd\") " Oct 03 15:02:26 crc kubenswrapper[4774]: I1003 15:02:26.708278 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-scripts\") pod \"b5d11ef5-a971-4de4-b14c-94dda7a14bbd\" (UID: \"b5d11ef5-a971-4de4-b14c-94dda7a14bbd\") " Oct 03 15:02:26 crc kubenswrapper[4774]: I1003 15:02:26.708345 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhl6w\" (UniqueName: \"kubernetes.io/projected/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-kube-api-access-jhl6w\") pod \"b5d11ef5-a971-4de4-b14c-94dda7a14bbd\" (UID: \"b5d11ef5-a971-4de4-b14c-94dda7a14bbd\") " Oct 03 15:02:26 crc kubenswrapper[4774]: I1003 15:02:26.708464 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-sg-core-conf-yaml\") pod \"b5d11ef5-a971-4de4-b14c-94dda7a14bbd\" (UID: \"b5d11ef5-a971-4de4-b14c-94dda7a14bbd\") " Oct 03 15:02:26 crc kubenswrapper[4774]: I1003 15:02:26.708528 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-run-httpd\") pod \"b5d11ef5-a971-4de4-b14c-94dda7a14bbd\" (UID: \"b5d11ef5-a971-4de4-b14c-94dda7a14bbd\") " Oct 03 15:02:26 crc kubenswrapper[4774]: I1003 15:02:26.708568 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-log-httpd\") pod \"b5d11ef5-a971-4de4-b14c-94dda7a14bbd\" (UID: \"b5d11ef5-a971-4de4-b14c-94dda7a14bbd\") " Oct 03 15:02:26 crc kubenswrapper[4774]: I1003 15:02:26.710842 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b5d11ef5-a971-4de4-b14c-94dda7a14bbd" (UID: "b5d11ef5-a971-4de4-b14c-94dda7a14bbd"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:02:26 crc kubenswrapper[4774]: I1003 15:02:26.711249 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b5d11ef5-a971-4de4-b14c-94dda7a14bbd" (UID: "b5d11ef5-a971-4de4-b14c-94dda7a14bbd"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:02:26 crc kubenswrapper[4774]: I1003 15:02:26.718208 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-kube-api-access-jhl6w" (OuterVolumeSpecName: "kube-api-access-jhl6w") pod "b5d11ef5-a971-4de4-b14c-94dda7a14bbd" (UID: "b5d11ef5-a971-4de4-b14c-94dda7a14bbd"). InnerVolumeSpecName "kube-api-access-jhl6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:02:26 crc kubenswrapper[4774]: I1003 15:02:26.730442 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-scripts" (OuterVolumeSpecName: "scripts") pod "b5d11ef5-a971-4de4-b14c-94dda7a14bbd" (UID: "b5d11ef5-a971-4de4-b14c-94dda7a14bbd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:02:26 crc kubenswrapper[4774]: I1003 15:02:26.758316 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b5d11ef5-a971-4de4-b14c-94dda7a14bbd" (UID: "b5d11ef5-a971-4de4-b14c-94dda7a14bbd"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:02:26 crc kubenswrapper[4774]: I1003 15:02:26.779956 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5d11ef5-a971-4de4-b14c-94dda7a14bbd" (UID: "b5d11ef5-a971-4de4-b14c-94dda7a14bbd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:02:26 crc kubenswrapper[4774]: I1003 15:02:26.810802 4774 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:26 crc kubenswrapper[4774]: I1003 15:02:26.811171 4774 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:26 crc kubenswrapper[4774]: I1003 15:02:26.811184 4774 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:26 crc kubenswrapper[4774]: I1003 15:02:26.811197 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:26 crc kubenswrapper[4774]: I1003 15:02:26.811210 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:26 crc kubenswrapper[4774]: I1003 15:02:26.811221 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhl6w\" (UniqueName: \"kubernetes.io/projected/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-kube-api-access-jhl6w\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:26 crc kubenswrapper[4774]: I1003 15:02:26.837986 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-config-data" (OuterVolumeSpecName: "config-data") pod "b5d11ef5-a971-4de4-b14c-94dda7a14bbd" (UID: "b5d11ef5-a971-4de4-b14c-94dda7a14bbd"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:02:26 crc kubenswrapper[4774]: I1003 15:02:26.912863 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5d11ef5-a971-4de4-b14c-94dda7a14bbd-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.159118 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b5d11ef5-a971-4de4-b14c-94dda7a14bbd","Type":"ContainerDied","Data":"f135db6fadef5bd5f26ff4a410163c852294d92e179c2aa3932188111e44fb71"} Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.159167 4774 scope.go:117] "RemoveContainer" containerID="89632d8b468b690f38fc770bac5579fc7354d4bb294d39dc9b642d4b28f5c31b" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.160674 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.187286 4774 scope.go:117] "RemoveContainer" containerID="05d6297fd6453621fa2033d582686766b199331526522a2998690d57e20dcbbc" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.204328 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.210191 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.231667 4774 scope.go:117] "RemoveContainer" containerID="b20dfe7376a44bc9f3b32d0f25ab182c8bcf66afb501850833b1ac93015022b0" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.262232 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:02:27 crc kubenswrapper[4774]: E1003 15:02:27.262877 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c" containerName="neutron-httpd" 
Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.262895 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c" containerName="neutron-httpd" Oct 03 15:02:27 crc kubenswrapper[4774]: E1003 15:02:27.262917 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c" containerName="neutron-api" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.262924 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c" containerName="neutron-api" Oct 03 15:02:27 crc kubenswrapper[4774]: E1003 15:02:27.262948 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5d11ef5-a971-4de4-b14c-94dda7a14bbd" containerName="ceilometer-central-agent" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.262955 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5d11ef5-a971-4de4-b14c-94dda7a14bbd" containerName="ceilometer-central-agent" Oct 03 15:02:27 crc kubenswrapper[4774]: E1003 15:02:27.262979 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5d11ef5-a971-4de4-b14c-94dda7a14bbd" containerName="sg-core" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.262985 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5d11ef5-a971-4de4-b14c-94dda7a14bbd" containerName="sg-core" Oct 03 15:02:27 crc kubenswrapper[4774]: E1003 15:02:27.263015 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5d11ef5-a971-4de4-b14c-94dda7a14bbd" containerName="ceilometer-notification-agent" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.263021 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5d11ef5-a971-4de4-b14c-94dda7a14bbd" containerName="ceilometer-notification-agent" Oct 03 15:02:27 crc kubenswrapper[4774]: E1003 15:02:27.263038 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5d11ef5-a971-4de4-b14c-94dda7a14bbd" 
containerName="proxy-httpd" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.263044 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5d11ef5-a971-4de4-b14c-94dda7a14bbd" containerName="proxy-httpd" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.263391 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5d11ef5-a971-4de4-b14c-94dda7a14bbd" containerName="ceilometer-central-agent" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.263410 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5d11ef5-a971-4de4-b14c-94dda7a14bbd" containerName="sg-core" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.263427 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5d11ef5-a971-4de4-b14c-94dda7a14bbd" containerName="ceilometer-notification-agent" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.263438 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c" containerName="neutron-api" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.263451 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5d11ef5-a971-4de4-b14c-94dda7a14bbd" containerName="proxy-httpd" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.263465 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="6171a5b5-11a3-4e9a-8fbd-11b43fc3e93c" containerName="neutron-httpd" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.267490 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.269877 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.270751 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.273092 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.276336 4774 scope.go:117] "RemoveContainer" containerID="c2d9303de5a6c755d35f45b773e52468c1a6084003abd1875b67115bd32c86b6" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.309729 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5d11ef5-a971-4de4-b14c-94dda7a14bbd" path="/var/lib/kubelet/pods/b5d11ef5-a971-4de4-b14c-94dda7a14bbd/volumes" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.421934 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61b07b0f-462d-4e27-b251-85a8d869433a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"61b07b0f-462d-4e27-b251-85a8d869433a\") " pod="openstack/ceilometer-0" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.422068 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61b07b0f-462d-4e27-b251-85a8d869433a-log-httpd\") pod \"ceilometer-0\" (UID: \"61b07b0f-462d-4e27-b251-85a8d869433a\") " pod="openstack/ceilometer-0" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.422192 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61b07b0f-462d-4e27-b251-85a8d869433a-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"61b07b0f-462d-4e27-b251-85a8d869433a\") " pod="openstack/ceilometer-0" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.422215 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkhsl\" (UniqueName: \"kubernetes.io/projected/61b07b0f-462d-4e27-b251-85a8d869433a-kube-api-access-xkhsl\") pod \"ceilometer-0\" (UID: \"61b07b0f-462d-4e27-b251-85a8d869433a\") " pod="openstack/ceilometer-0" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.422243 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61b07b0f-462d-4e27-b251-85a8d869433a-run-httpd\") pod \"ceilometer-0\" (UID: \"61b07b0f-462d-4e27-b251-85a8d869433a\") " pod="openstack/ceilometer-0" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.422284 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61b07b0f-462d-4e27-b251-85a8d869433a-scripts\") pod \"ceilometer-0\" (UID: \"61b07b0f-462d-4e27-b251-85a8d869433a\") " pod="openstack/ceilometer-0" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.422306 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61b07b0f-462d-4e27-b251-85a8d869433a-config-data\") pod \"ceilometer-0\" (UID: \"61b07b0f-462d-4e27-b251-85a8d869433a\") " pod="openstack/ceilometer-0" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.524407 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61b07b0f-462d-4e27-b251-85a8d869433a-config-data\") pod \"ceilometer-0\" (UID: \"61b07b0f-462d-4e27-b251-85a8d869433a\") " pod="openstack/ceilometer-0" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.524561 4774 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61b07b0f-462d-4e27-b251-85a8d869433a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"61b07b0f-462d-4e27-b251-85a8d869433a\") " pod="openstack/ceilometer-0" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.524691 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61b07b0f-462d-4e27-b251-85a8d869433a-log-httpd\") pod \"ceilometer-0\" (UID: \"61b07b0f-462d-4e27-b251-85a8d869433a\") " pod="openstack/ceilometer-0" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.524819 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkhsl\" (UniqueName: \"kubernetes.io/projected/61b07b0f-462d-4e27-b251-85a8d869433a-kube-api-access-xkhsl\") pod \"ceilometer-0\" (UID: \"61b07b0f-462d-4e27-b251-85a8d869433a\") " pod="openstack/ceilometer-0" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.525224 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61b07b0f-462d-4e27-b251-85a8d869433a-log-httpd\") pod \"ceilometer-0\" (UID: \"61b07b0f-462d-4e27-b251-85a8d869433a\") " pod="openstack/ceilometer-0" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.525887 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61b07b0f-462d-4e27-b251-85a8d869433a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"61b07b0f-462d-4e27-b251-85a8d869433a\") " pod="openstack/ceilometer-0" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.526501 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61b07b0f-462d-4e27-b251-85a8d869433a-run-httpd\") pod \"ceilometer-0\" (UID: 
\"61b07b0f-462d-4e27-b251-85a8d869433a\") " pod="openstack/ceilometer-0" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.526621 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61b07b0f-462d-4e27-b251-85a8d869433a-scripts\") pod \"ceilometer-0\" (UID: \"61b07b0f-462d-4e27-b251-85a8d869433a\") " pod="openstack/ceilometer-0" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.526974 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61b07b0f-462d-4e27-b251-85a8d869433a-run-httpd\") pod \"ceilometer-0\" (UID: \"61b07b0f-462d-4e27-b251-85a8d869433a\") " pod="openstack/ceilometer-0" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.529341 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61b07b0f-462d-4e27-b251-85a8d869433a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"61b07b0f-462d-4e27-b251-85a8d869433a\") " pod="openstack/ceilometer-0" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.529530 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61b07b0f-462d-4e27-b251-85a8d869433a-config-data\") pod \"ceilometer-0\" (UID: \"61b07b0f-462d-4e27-b251-85a8d869433a\") " pod="openstack/ceilometer-0" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.530674 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61b07b0f-462d-4e27-b251-85a8d869433a-scripts\") pod \"ceilometer-0\" (UID: \"61b07b0f-462d-4e27-b251-85a8d869433a\") " pod="openstack/ceilometer-0" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.531602 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/61b07b0f-462d-4e27-b251-85a8d869433a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"61b07b0f-462d-4e27-b251-85a8d869433a\") " pod="openstack/ceilometer-0" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.536798 4774 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod9ab201e1-9ef3-485b-81f2-0b421dcc66cc"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod9ab201e1-9ef3-485b-81f2-0b421dcc66cc] : Timed out while waiting for systemd to remove kubepods-besteffort-pod9ab201e1_9ef3_485b_81f2_0b421dcc66cc.slice" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.542499 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkhsl\" (UniqueName: \"kubernetes.io/projected/61b07b0f-462d-4e27-b251-85a8d869433a-kube-api-access-xkhsl\") pod \"ceilometer-0\" (UID: \"61b07b0f-462d-4e27-b251-85a8d869433a\") " pod="openstack/ceilometer-0" Oct 03 15:02:27 crc kubenswrapper[4774]: I1003 15:02:27.594872 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 15:02:28 crc kubenswrapper[4774]: I1003 15:02:28.090542 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:02:28 crc kubenswrapper[4774]: I1003 15:02:28.173592 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61b07b0f-462d-4e27-b251-85a8d869433a","Type":"ContainerStarted","Data":"5a51b2b323af232df00b84fc8bc22140278810604ea5f319086d8be90fdd51ba"} Oct 03 15:02:29 crc kubenswrapper[4774]: I1003 15:02:29.187623 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61b07b0f-462d-4e27-b251-85a8d869433a","Type":"ContainerStarted","Data":"b89bf2ce3a3e5e5815366e78eddaa47df6eb0e87baf85f18fcf762d1df57e0ef"} Oct 03 15:02:30 crc kubenswrapper[4774]: I1003 15:02:30.200577 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61b07b0f-462d-4e27-b251-85a8d869433a","Type":"ContainerStarted","Data":"a86fa64b9193a9e410ebadfa3c2adbe8f55679bfc01b57e812625095056114f9"} Oct 03 15:02:31 crc kubenswrapper[4774]: I1003 15:02:31.213049 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61b07b0f-462d-4e27-b251-85a8d869433a","Type":"ContainerStarted","Data":"596fe207acb42e3216e639196ebe78bef80eec721f99081ede5173ad751acc80"} Oct 03 15:02:32 crc kubenswrapper[4774]: I1003 15:02:32.225860 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61b07b0f-462d-4e27-b251-85a8d869433a","Type":"ContainerStarted","Data":"e77bd534b5300e2440ac57b7cae4166aab2355bf4eb237feec3fc435c863d7a4"} Oct 03 15:02:32 crc kubenswrapper[4774]: I1003 15:02:32.226700 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 15:02:32 crc kubenswrapper[4774]: I1003 15:02:32.256976 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=1.395726992 podStartE2EDuration="5.256955925s" podCreationTimestamp="2025-10-03 15:02:27 +0000 UTC" firstStartedPulling="2025-10-03 15:02:28.087864541 +0000 UTC m=+1170.677068013" lastFinishedPulling="2025-10-03 15:02:31.949093494 +0000 UTC m=+1174.538296946" observedRunningTime="2025-10-03 15:02:32.249177192 +0000 UTC m=+1174.838380644" watchObservedRunningTime="2025-10-03 15:02:32.256955925 +0000 UTC m=+1174.846159377" Oct 03 15:02:35 crc kubenswrapper[4774]: I1003 15:02:35.253339 4774 generic.go:334] "Generic (PLEG): container finished" podID="c24e835e-fdf1-44ec-ad96-3b54ab88253e" containerID="125da0755aa46a51b1d2761dd1db9566f71c631bfdd9f55ac18319b30da33858" exitCode=0 Oct 03 15:02:35 crc kubenswrapper[4774]: I1003 15:02:35.253442 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wpkth" event={"ID":"c24e835e-fdf1-44ec-ad96-3b54ab88253e","Type":"ContainerDied","Data":"125da0755aa46a51b1d2761dd1db9566f71c631bfdd9f55ac18319b30da33858"} Oct 03 15:02:36 crc kubenswrapper[4774]: I1003 15:02:36.651313 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wpkth" Oct 03 15:02:36 crc kubenswrapper[4774]: I1003 15:02:36.741092 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c24e835e-fdf1-44ec-ad96-3b54ab88253e-scripts\") pod \"c24e835e-fdf1-44ec-ad96-3b54ab88253e\" (UID: \"c24e835e-fdf1-44ec-ad96-3b54ab88253e\") " Oct 03 15:02:36 crc kubenswrapper[4774]: I1003 15:02:36.741171 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c24e835e-fdf1-44ec-ad96-3b54ab88253e-config-data\") pod \"c24e835e-fdf1-44ec-ad96-3b54ab88253e\" (UID: \"c24e835e-fdf1-44ec-ad96-3b54ab88253e\") " Oct 03 15:02:36 crc kubenswrapper[4774]: I1003 15:02:36.741240 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c24e835e-fdf1-44ec-ad96-3b54ab88253e-combined-ca-bundle\") pod \"c24e835e-fdf1-44ec-ad96-3b54ab88253e\" (UID: \"c24e835e-fdf1-44ec-ad96-3b54ab88253e\") " Oct 03 15:02:36 crc kubenswrapper[4774]: I1003 15:02:36.741296 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdmk5\" (UniqueName: \"kubernetes.io/projected/c24e835e-fdf1-44ec-ad96-3b54ab88253e-kube-api-access-cdmk5\") pod \"c24e835e-fdf1-44ec-ad96-3b54ab88253e\" (UID: \"c24e835e-fdf1-44ec-ad96-3b54ab88253e\") " Oct 03 15:02:36 crc kubenswrapper[4774]: I1003 15:02:36.764539 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c24e835e-fdf1-44ec-ad96-3b54ab88253e-scripts" (OuterVolumeSpecName: "scripts") pod "c24e835e-fdf1-44ec-ad96-3b54ab88253e" (UID: "c24e835e-fdf1-44ec-ad96-3b54ab88253e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:02:36 crc kubenswrapper[4774]: I1003 15:02:36.767805 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c24e835e-fdf1-44ec-ad96-3b54ab88253e-kube-api-access-cdmk5" (OuterVolumeSpecName: "kube-api-access-cdmk5") pod "c24e835e-fdf1-44ec-ad96-3b54ab88253e" (UID: "c24e835e-fdf1-44ec-ad96-3b54ab88253e"). InnerVolumeSpecName "kube-api-access-cdmk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:02:36 crc kubenswrapper[4774]: I1003 15:02:36.777864 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c24e835e-fdf1-44ec-ad96-3b54ab88253e-config-data" (OuterVolumeSpecName: "config-data") pod "c24e835e-fdf1-44ec-ad96-3b54ab88253e" (UID: "c24e835e-fdf1-44ec-ad96-3b54ab88253e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:02:36 crc kubenswrapper[4774]: I1003 15:02:36.778566 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c24e835e-fdf1-44ec-ad96-3b54ab88253e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c24e835e-fdf1-44ec-ad96-3b54ab88253e" (UID: "c24e835e-fdf1-44ec-ad96-3b54ab88253e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:02:36 crc kubenswrapper[4774]: I1003 15:02:36.844919 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdmk5\" (UniqueName: \"kubernetes.io/projected/c24e835e-fdf1-44ec-ad96-3b54ab88253e-kube-api-access-cdmk5\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:36 crc kubenswrapper[4774]: I1003 15:02:36.845162 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c24e835e-fdf1-44ec-ad96-3b54ab88253e-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:36 crc kubenswrapper[4774]: I1003 15:02:36.845221 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c24e835e-fdf1-44ec-ad96-3b54ab88253e-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:36 crc kubenswrapper[4774]: I1003 15:02:36.845284 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c24e835e-fdf1-44ec-ad96-3b54ab88253e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:02:37 crc kubenswrapper[4774]: I1003 15:02:37.274269 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wpkth" event={"ID":"c24e835e-fdf1-44ec-ad96-3b54ab88253e","Type":"ContainerDied","Data":"e915ed3d43111e79c24e4f10571502de64dd7cce2b2c76e7a78abf5beff480c7"} Oct 03 15:02:37 crc kubenswrapper[4774]: I1003 15:02:37.274627 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e915ed3d43111e79c24e4f10571502de64dd7cce2b2c76e7a78abf5beff480c7" Oct 03 15:02:37 crc kubenswrapper[4774]: I1003 15:02:37.274315 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wpkth" Oct 03 15:02:37 crc kubenswrapper[4774]: I1003 15:02:37.381144 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 15:02:37 crc kubenswrapper[4774]: E1003 15:02:37.381604 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24e835e-fdf1-44ec-ad96-3b54ab88253e" containerName="nova-cell0-conductor-db-sync" Oct 03 15:02:37 crc kubenswrapper[4774]: I1003 15:02:37.381626 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24e835e-fdf1-44ec-ad96-3b54ab88253e" containerName="nova-cell0-conductor-db-sync" Oct 03 15:02:37 crc kubenswrapper[4774]: I1003 15:02:37.381879 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="c24e835e-fdf1-44ec-ad96-3b54ab88253e" containerName="nova-cell0-conductor-db-sync" Oct 03 15:02:37 crc kubenswrapper[4774]: I1003 15:02:37.382579 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 03 15:02:37 crc kubenswrapper[4774]: I1003 15:02:37.384836 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 03 15:02:37 crc kubenswrapper[4774]: I1003 15:02:37.386092 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-29swj" Oct 03 15:02:37 crc kubenswrapper[4774]: I1003 15:02:37.401817 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 15:02:37 crc kubenswrapper[4774]: I1003 15:02:37.556676 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/747916db-cbde-4597-be0a-1e2034b1afca-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"747916db-cbde-4597-be0a-1e2034b1afca\") " pod="openstack/nova-cell0-conductor-0" Oct 03 15:02:37 crc kubenswrapper[4774]: I1003 
15:02:37.556744 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/747916db-cbde-4597-be0a-1e2034b1afca-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"747916db-cbde-4597-be0a-1e2034b1afca\") " pod="openstack/nova-cell0-conductor-0" Oct 03 15:02:37 crc kubenswrapper[4774]: I1003 15:02:37.556857 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l2lt\" (UniqueName: \"kubernetes.io/projected/747916db-cbde-4597-be0a-1e2034b1afca-kube-api-access-8l2lt\") pod \"nova-cell0-conductor-0\" (UID: \"747916db-cbde-4597-be0a-1e2034b1afca\") " pod="openstack/nova-cell0-conductor-0" Oct 03 15:02:37 crc kubenswrapper[4774]: I1003 15:02:37.658725 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/747916db-cbde-4597-be0a-1e2034b1afca-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"747916db-cbde-4597-be0a-1e2034b1afca\") " pod="openstack/nova-cell0-conductor-0" Oct 03 15:02:37 crc kubenswrapper[4774]: I1003 15:02:37.658877 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/747916db-cbde-4597-be0a-1e2034b1afca-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"747916db-cbde-4597-be0a-1e2034b1afca\") " pod="openstack/nova-cell0-conductor-0" Oct 03 15:02:37 crc kubenswrapper[4774]: I1003 15:02:37.659020 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l2lt\" (UniqueName: \"kubernetes.io/projected/747916db-cbde-4597-be0a-1e2034b1afca-kube-api-access-8l2lt\") pod \"nova-cell0-conductor-0\" (UID: \"747916db-cbde-4597-be0a-1e2034b1afca\") " pod="openstack/nova-cell0-conductor-0" Oct 03 15:02:37 crc kubenswrapper[4774]: I1003 15:02:37.664141 4774 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/747916db-cbde-4597-be0a-1e2034b1afca-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"747916db-cbde-4597-be0a-1e2034b1afca\") " pod="openstack/nova-cell0-conductor-0" Oct 03 15:02:37 crc kubenswrapper[4774]: I1003 15:02:37.665282 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/747916db-cbde-4597-be0a-1e2034b1afca-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"747916db-cbde-4597-be0a-1e2034b1afca\") " pod="openstack/nova-cell0-conductor-0" Oct 03 15:02:37 crc kubenswrapper[4774]: I1003 15:02:37.688678 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l2lt\" (UniqueName: \"kubernetes.io/projected/747916db-cbde-4597-be0a-1e2034b1afca-kube-api-access-8l2lt\") pod \"nova-cell0-conductor-0\" (UID: \"747916db-cbde-4597-be0a-1e2034b1afca\") " pod="openstack/nova-cell0-conductor-0" Oct 03 15:02:37 crc kubenswrapper[4774]: I1003 15:02:37.706793 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 03 15:02:38 crc kubenswrapper[4774]: I1003 15:02:38.212840 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 15:02:38 crc kubenswrapper[4774]: I1003 15:02:38.289061 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"747916db-cbde-4597-be0a-1e2034b1afca","Type":"ContainerStarted","Data":"1544aa419a26103f46a93e846754327c390a25e955cc4c5b94184f236bc66cfd"} Oct 03 15:02:39 crc kubenswrapper[4774]: I1003 15:02:39.330918 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"747916db-cbde-4597-be0a-1e2034b1afca","Type":"ContainerStarted","Data":"eea80a64f96c13a22c696d0ff452fc95c20194c543c1c38b81ee1521610fe1ef"} Oct 03 15:02:39 crc kubenswrapper[4774]: I1003 15:02:39.331225 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 03 15:02:39 crc kubenswrapper[4774]: I1003 15:02:39.389183 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.389160021 podStartE2EDuration="2.389160021s" podCreationTimestamp="2025-10-03 15:02:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:02:39.385717836 +0000 UTC m=+1181.974921298" watchObservedRunningTime="2025-10-03 15:02:39.389160021 +0000 UTC m=+1181.978363473" Oct 03 15:02:47 crc kubenswrapper[4774]: I1003 15:02:47.756040 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.235393 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-5fb4f"] Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.236733 4774 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5fb4f" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.239571 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.239806 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.248781 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-5fb4f"] Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.366867 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77049c85-7aed-49f4-8dff-4a9a7a3a6b06-config-data\") pod \"nova-cell0-cell-mapping-5fb4f\" (UID: \"77049c85-7aed-49f4-8dff-4a9a7a3a6b06\") " pod="openstack/nova-cell0-cell-mapping-5fb4f" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.366927 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8szvm\" (UniqueName: \"kubernetes.io/projected/77049c85-7aed-49f4-8dff-4a9a7a3a6b06-kube-api-access-8szvm\") pod \"nova-cell0-cell-mapping-5fb4f\" (UID: \"77049c85-7aed-49f4-8dff-4a9a7a3a6b06\") " pod="openstack/nova-cell0-cell-mapping-5fb4f" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.366960 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77049c85-7aed-49f4-8dff-4a9a7a3a6b06-scripts\") pod \"nova-cell0-cell-mapping-5fb4f\" (UID: \"77049c85-7aed-49f4-8dff-4a9a7a3a6b06\") " pod="openstack/nova-cell0-cell-mapping-5fb4f" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.367096 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77049c85-7aed-49f4-8dff-4a9a7a3a6b06-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-5fb4f\" (UID: \"77049c85-7aed-49f4-8dff-4a9a7a3a6b06\") " pod="openstack/nova-cell0-cell-mapping-5fb4f" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.405433 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.407154 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.409525 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.430176 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.470301 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77049c85-7aed-49f4-8dff-4a9a7a3a6b06-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-5fb4f\" (UID: \"77049c85-7aed-49f4-8dff-4a9a7a3a6b06\") " pod="openstack/nova-cell0-cell-mapping-5fb4f" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.470433 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b379783f-0164-447b-a92f-408ff7901cea-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b379783f-0164-447b-a92f-408ff7901cea\") " pod="openstack/nova-scheduler-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.470469 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77049c85-7aed-49f4-8dff-4a9a7a3a6b06-config-data\") pod \"nova-cell0-cell-mapping-5fb4f\" (UID: 
\"77049c85-7aed-49f4-8dff-4a9a7a3a6b06\") " pod="openstack/nova-cell0-cell-mapping-5fb4f" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.470494 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8szvm\" (UniqueName: \"kubernetes.io/projected/77049c85-7aed-49f4-8dff-4a9a7a3a6b06-kube-api-access-8szvm\") pod \"nova-cell0-cell-mapping-5fb4f\" (UID: \"77049c85-7aed-49f4-8dff-4a9a7a3a6b06\") " pod="openstack/nova-cell0-cell-mapping-5fb4f" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.470512 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77049c85-7aed-49f4-8dff-4a9a7a3a6b06-scripts\") pod \"nova-cell0-cell-mapping-5fb4f\" (UID: \"77049c85-7aed-49f4-8dff-4a9a7a3a6b06\") " pod="openstack/nova-cell0-cell-mapping-5fb4f" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.470553 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b379783f-0164-447b-a92f-408ff7901cea-config-data\") pod \"nova-scheduler-0\" (UID: \"b379783f-0164-447b-a92f-408ff7901cea\") " pod="openstack/nova-scheduler-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.470580 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw68d\" (UniqueName: \"kubernetes.io/projected/b379783f-0164-447b-a92f-408ff7901cea-kube-api-access-xw68d\") pod \"nova-scheduler-0\" (UID: \"b379783f-0164-447b-a92f-408ff7901cea\") " pod="openstack/nova-scheduler-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.477147 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77049c85-7aed-49f4-8dff-4a9a7a3a6b06-config-data\") pod \"nova-cell0-cell-mapping-5fb4f\" (UID: \"77049c85-7aed-49f4-8dff-4a9a7a3a6b06\") " 
pod="openstack/nova-cell0-cell-mapping-5fb4f" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.479126 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77049c85-7aed-49f4-8dff-4a9a7a3a6b06-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-5fb4f\" (UID: \"77049c85-7aed-49f4-8dff-4a9a7a3a6b06\") " pod="openstack/nova-cell0-cell-mapping-5fb4f" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.482141 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77049c85-7aed-49f4-8dff-4a9a7a3a6b06-scripts\") pod \"nova-cell0-cell-mapping-5fb4f\" (UID: \"77049c85-7aed-49f4-8dff-4a9a7a3a6b06\") " pod="openstack/nova-cell0-cell-mapping-5fb4f" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.509887 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8szvm\" (UniqueName: \"kubernetes.io/projected/77049c85-7aed-49f4-8dff-4a9a7a3a6b06-kube-api-access-8szvm\") pod \"nova-cell0-cell-mapping-5fb4f\" (UID: \"77049c85-7aed-49f4-8dff-4a9a7a3a6b06\") " pod="openstack/nova-cell0-cell-mapping-5fb4f" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.525436 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.527119 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.536131 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.553654 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.555229 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.565882 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5fb4f" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.571029 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.572547 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b379783f-0164-447b-a92f-408ff7901cea-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b379783f-0164-447b-a92f-408ff7901cea\") " pod="openstack/nova-scheduler-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.572614 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b379783f-0164-447b-a92f-408ff7901cea-config-data\") pod \"nova-scheduler-0\" (UID: \"b379783f-0164-447b-a92f-408ff7901cea\") " pod="openstack/nova-scheduler-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.572641 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw68d\" (UniqueName: \"kubernetes.io/projected/b379783f-0164-447b-a92f-408ff7901cea-kube-api-access-xw68d\") pod \"nova-scheduler-0\" (UID: \"b379783f-0164-447b-a92f-408ff7901cea\") " pod="openstack/nova-scheduler-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.599259 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.614898 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b379783f-0164-447b-a92f-408ff7901cea-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b379783f-0164-447b-a92f-408ff7901cea\") " 
pod="openstack/nova-scheduler-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.615438 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b379783f-0164-447b-a92f-408ff7901cea-config-data\") pod \"nova-scheduler-0\" (UID: \"b379783f-0164-447b-a92f-408ff7901cea\") " pod="openstack/nova-scheduler-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.615469 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw68d\" (UniqueName: \"kubernetes.io/projected/b379783f-0164-447b-a92f-408ff7901cea-kube-api-access-xw68d\") pod \"nova-scheduler-0\" (UID: \"b379783f-0164-447b-a92f-408ff7901cea\") " pod="openstack/nova-scheduler-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.622446 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.652053 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.669499 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.671772 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.688452 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cee99571-3f9a-49ba-bced-bbb3e3a723e7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cee99571-3f9a-49ba-bced-bbb3e3a723e7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.688516 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76464c8b-55de-4a35-91ab-8cee9db23bf7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"76464c8b-55de-4a35-91ab-8cee9db23bf7\") " pod="openstack/nova-api-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.688546 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t25h6\" (UniqueName: \"kubernetes.io/projected/cee99571-3f9a-49ba-bced-bbb3e3a723e7-kube-api-access-t25h6\") pod \"nova-cell1-novncproxy-0\" (UID: \"cee99571-3f9a-49ba-bced-bbb3e3a723e7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.688580 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76464c8b-55de-4a35-91ab-8cee9db23bf7-config-data\") pod \"nova-api-0\" (UID: \"76464c8b-55de-4a35-91ab-8cee9db23bf7\") " pod="openstack/nova-api-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.688725 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7xc4\" (UniqueName: 
\"kubernetes.io/projected/76464c8b-55de-4a35-91ab-8cee9db23bf7-kube-api-access-r7xc4\") pod \"nova-api-0\" (UID: \"76464c8b-55de-4a35-91ab-8cee9db23bf7\") " pod="openstack/nova-api-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.688912 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76464c8b-55de-4a35-91ab-8cee9db23bf7-logs\") pod \"nova-api-0\" (UID: \"76464c8b-55de-4a35-91ab-8cee9db23bf7\") " pod="openstack/nova-api-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.688952 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cee99571-3f9a-49ba-bced-bbb3e3a723e7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cee99571-3f9a-49ba-bced-bbb3e3a723e7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.754499 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.759576 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.779282 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-jshth"] Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.781031 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-jshth" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.792488 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76464c8b-55de-4a35-91ab-8cee9db23bf7-logs\") pod \"nova-api-0\" (UID: \"76464c8b-55de-4a35-91ab-8cee9db23bf7\") " pod="openstack/nova-api-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.792539 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cee99571-3f9a-49ba-bced-bbb3e3a723e7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cee99571-3f9a-49ba-bced-bbb3e3a723e7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.792579 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51d6dfe6-9084-4868-93c7-58fe660a6d98-config-data\") pod \"nova-metadata-0\" (UID: \"51d6dfe6-9084-4868-93c7-58fe660a6d98\") " pod="openstack/nova-metadata-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.792625 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cee99571-3f9a-49ba-bced-bbb3e3a723e7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cee99571-3f9a-49ba-bced-bbb3e3a723e7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.792642 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76464c8b-55de-4a35-91ab-8cee9db23bf7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"76464c8b-55de-4a35-91ab-8cee9db23bf7\") " pod="openstack/nova-api-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.792659 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-t25h6\" (UniqueName: \"kubernetes.io/projected/cee99571-3f9a-49ba-bced-bbb3e3a723e7-kube-api-access-t25h6\") pod \"nova-cell1-novncproxy-0\" (UID: \"cee99571-3f9a-49ba-bced-bbb3e3a723e7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.792674 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76464c8b-55de-4a35-91ab-8cee9db23bf7-config-data\") pod \"nova-api-0\" (UID: \"76464c8b-55de-4a35-91ab-8cee9db23bf7\") " pod="openstack/nova-api-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.792724 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7xc4\" (UniqueName: \"kubernetes.io/projected/76464c8b-55de-4a35-91ab-8cee9db23bf7-kube-api-access-r7xc4\") pod \"nova-api-0\" (UID: \"76464c8b-55de-4a35-91ab-8cee9db23bf7\") " pod="openstack/nova-api-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.792741 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d6dfe6-9084-4868-93c7-58fe660a6d98-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"51d6dfe6-9084-4868-93c7-58fe660a6d98\") " pod="openstack/nova-metadata-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.792763 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51d6dfe6-9084-4868-93c7-58fe660a6d98-logs\") pod \"nova-metadata-0\" (UID: \"51d6dfe6-9084-4868-93c7-58fe660a6d98\") " pod="openstack/nova-metadata-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.792790 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq4sg\" (UniqueName: 
\"kubernetes.io/projected/51d6dfe6-9084-4868-93c7-58fe660a6d98-kube-api-access-vq4sg\") pod \"nova-metadata-0\" (UID: \"51d6dfe6-9084-4868-93c7-58fe660a6d98\") " pod="openstack/nova-metadata-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.793186 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76464c8b-55de-4a35-91ab-8cee9db23bf7-logs\") pod \"nova-api-0\" (UID: \"76464c8b-55de-4a35-91ab-8cee9db23bf7\") " pod="openstack/nova-api-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.798656 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cee99571-3f9a-49ba-bced-bbb3e3a723e7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cee99571-3f9a-49ba-bced-bbb3e3a723e7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.799688 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-jshth"] Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.804134 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76464c8b-55de-4a35-91ab-8cee9db23bf7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"76464c8b-55de-4a35-91ab-8cee9db23bf7\") " pod="openstack/nova-api-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.805318 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cee99571-3f9a-49ba-bced-bbb3e3a723e7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cee99571-3f9a-49ba-bced-bbb3e3a723e7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.807185 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/76464c8b-55de-4a35-91ab-8cee9db23bf7-config-data\") pod \"nova-api-0\" (UID: \"76464c8b-55de-4a35-91ab-8cee9db23bf7\") " pod="openstack/nova-api-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.810129 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t25h6\" (UniqueName: \"kubernetes.io/projected/cee99571-3f9a-49ba-bced-bbb3e3a723e7-kube-api-access-t25h6\") pod \"nova-cell1-novncproxy-0\" (UID: \"cee99571-3f9a-49ba-bced-bbb3e3a723e7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.815267 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7xc4\" (UniqueName: \"kubernetes.io/projected/76464c8b-55de-4a35-91ab-8cee9db23bf7-kube-api-access-r7xc4\") pod \"nova-api-0\" (UID: \"76464c8b-55de-4a35-91ab-8cee9db23bf7\") " pod="openstack/nova-api-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.909292 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4786927-81ff-4f7e-9c17-558e01bf47fe-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-jshth\" (UID: \"a4786927-81ff-4f7e-9c17-558e01bf47fe\") " pod="openstack/dnsmasq-dns-845d6d6f59-jshth" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.909349 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51d6dfe6-9084-4868-93c7-58fe660a6d98-logs\") pod \"nova-metadata-0\" (UID: \"51d6dfe6-9084-4868-93c7-58fe660a6d98\") " pod="openstack/nova-metadata-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.909442 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4pk2\" (UniqueName: \"kubernetes.io/projected/a4786927-81ff-4f7e-9c17-558e01bf47fe-kube-api-access-m4pk2\") pod \"dnsmasq-dns-845d6d6f59-jshth\" (UID: 
\"a4786927-81ff-4f7e-9c17-558e01bf47fe\") " pod="openstack/dnsmasq-dns-845d6d6f59-jshth" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.909502 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq4sg\" (UniqueName: \"kubernetes.io/projected/51d6dfe6-9084-4868-93c7-58fe660a6d98-kube-api-access-vq4sg\") pod \"nova-metadata-0\" (UID: \"51d6dfe6-9084-4868-93c7-58fe660a6d98\") " pod="openstack/nova-metadata-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.909778 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4786927-81ff-4f7e-9c17-558e01bf47fe-config\") pod \"dnsmasq-dns-845d6d6f59-jshth\" (UID: \"a4786927-81ff-4f7e-9c17-558e01bf47fe\") " pod="openstack/dnsmasq-dns-845d6d6f59-jshth" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.909835 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4786927-81ff-4f7e-9c17-558e01bf47fe-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-jshth\" (UID: \"a4786927-81ff-4f7e-9c17-558e01bf47fe\") " pod="openstack/dnsmasq-dns-845d6d6f59-jshth" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.909870 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51d6dfe6-9084-4868-93c7-58fe660a6d98-config-data\") pod \"nova-metadata-0\" (UID: \"51d6dfe6-9084-4868-93c7-58fe660a6d98\") " pod="openstack/nova-metadata-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.909891 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4786927-81ff-4f7e-9c17-558e01bf47fe-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-jshth\" (UID: \"a4786927-81ff-4f7e-9c17-558e01bf47fe\") " 
pod="openstack/dnsmasq-dns-845d6d6f59-jshth" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.909957 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4786927-81ff-4f7e-9c17-558e01bf47fe-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-jshth\" (UID: \"a4786927-81ff-4f7e-9c17-558e01bf47fe\") " pod="openstack/dnsmasq-dns-845d6d6f59-jshth" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.910147 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d6dfe6-9084-4868-93c7-58fe660a6d98-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"51d6dfe6-9084-4868-93c7-58fe660a6d98\") " pod="openstack/nova-metadata-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.914629 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51d6dfe6-9084-4868-93c7-58fe660a6d98-logs\") pod \"nova-metadata-0\" (UID: \"51d6dfe6-9084-4868-93c7-58fe660a6d98\") " pod="openstack/nova-metadata-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.916659 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d6dfe6-9084-4868-93c7-58fe660a6d98-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"51d6dfe6-9084-4868-93c7-58fe660a6d98\") " pod="openstack/nova-metadata-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.921244 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51d6dfe6-9084-4868-93c7-58fe660a6d98-config-data\") pod \"nova-metadata-0\" (UID: \"51d6dfe6-9084-4868-93c7-58fe660a6d98\") " pod="openstack/nova-metadata-0" Oct 03 15:02:48 crc kubenswrapper[4774]: I1003 15:02:48.934160 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vq4sg\" (UniqueName: \"kubernetes.io/projected/51d6dfe6-9084-4868-93c7-58fe660a6d98-kube-api-access-vq4sg\") pod \"nova-metadata-0\" (UID: \"51d6dfe6-9084-4868-93c7-58fe660a6d98\") " pod="openstack/nova-metadata-0" Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.011397 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4786927-81ff-4f7e-9c17-558e01bf47fe-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-jshth\" (UID: \"a4786927-81ff-4f7e-9c17-558e01bf47fe\") " pod="openstack/dnsmasq-dns-845d6d6f59-jshth" Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.011519 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4786927-81ff-4f7e-9c17-558e01bf47fe-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-jshth\" (UID: \"a4786927-81ff-4f7e-9c17-558e01bf47fe\") " pod="openstack/dnsmasq-dns-845d6d6f59-jshth" Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.011558 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4pk2\" (UniqueName: \"kubernetes.io/projected/a4786927-81ff-4f7e-9c17-558e01bf47fe-kube-api-access-m4pk2\") pod \"dnsmasq-dns-845d6d6f59-jshth\" (UID: \"a4786927-81ff-4f7e-9c17-558e01bf47fe\") " pod="openstack/dnsmasq-dns-845d6d6f59-jshth" Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.011624 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4786927-81ff-4f7e-9c17-558e01bf47fe-config\") pod \"dnsmasq-dns-845d6d6f59-jshth\" (UID: \"a4786927-81ff-4f7e-9c17-558e01bf47fe\") " pod="openstack/dnsmasq-dns-845d6d6f59-jshth" Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.011647 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/a4786927-81ff-4f7e-9c17-558e01bf47fe-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-jshth\" (UID: \"a4786927-81ff-4f7e-9c17-558e01bf47fe\") " pod="openstack/dnsmasq-dns-845d6d6f59-jshth" Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.011670 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4786927-81ff-4f7e-9c17-558e01bf47fe-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-jshth\" (UID: \"a4786927-81ff-4f7e-9c17-558e01bf47fe\") " pod="openstack/dnsmasq-dns-845d6d6f59-jshth" Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.012690 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4786927-81ff-4f7e-9c17-558e01bf47fe-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-jshth\" (UID: \"a4786927-81ff-4f7e-9c17-558e01bf47fe\") " pod="openstack/dnsmasq-dns-845d6d6f59-jshth" Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.013142 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4786927-81ff-4f7e-9c17-558e01bf47fe-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-jshth\" (UID: \"a4786927-81ff-4f7e-9c17-558e01bf47fe\") " pod="openstack/dnsmasq-dns-845d6d6f59-jshth" Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.013584 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4786927-81ff-4f7e-9c17-558e01bf47fe-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-jshth\" (UID: \"a4786927-81ff-4f7e-9c17-558e01bf47fe\") " pod="openstack/dnsmasq-dns-845d6d6f59-jshth" Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.013898 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4786927-81ff-4f7e-9c17-558e01bf47fe-config\") pod 
\"dnsmasq-dns-845d6d6f59-jshth\" (UID: \"a4786927-81ff-4f7e-9c17-558e01bf47fe\") " pod="openstack/dnsmasq-dns-845d6d6f59-jshth" Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.014117 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4786927-81ff-4f7e-9c17-558e01bf47fe-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-jshth\" (UID: \"a4786927-81ff-4f7e-9c17-558e01bf47fe\") " pod="openstack/dnsmasq-dns-845d6d6f59-jshth" Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.034596 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4pk2\" (UniqueName: \"kubernetes.io/projected/a4786927-81ff-4f7e-9c17-558e01bf47fe-kube-api-access-m4pk2\") pod \"dnsmasq-dns-845d6d6f59-jshth\" (UID: \"a4786927-81ff-4f7e-9c17-558e01bf47fe\") " pod="openstack/dnsmasq-dns-845d6d6f59-jshth" Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.050424 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.075046 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.081935 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.110804 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-jshth" Oct 03 15:02:49 crc kubenswrapper[4774]: W1003 15:02:49.275032 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb379783f_0164_447b_a92f_408ff7901cea.slice/crio-954dbbc2324a332622ab560b9eb88c0a1013b51db8d07b486a8f3515ca618335 WatchSource:0}: Error finding container 954dbbc2324a332622ab560b9eb88c0a1013b51db8d07b486a8f3515ca618335: Status 404 returned error can't find the container with id 954dbbc2324a332622ab560b9eb88c0a1013b51db8d07b486a8f3515ca618335 Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.283848 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.360549 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zsswn"] Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.361827 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zsswn"] Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.361921 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zsswn" Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.363880 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.364326 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 03 15:02:49 crc kubenswrapper[4774]: W1003 15:02:49.380214 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77049c85_7aed_49f4_8dff_4a9a7a3a6b06.slice/crio-0b74d68ec031a8ed47a82a45ef679d786a9d79686286585373b1816ac7eed7c0 WatchSource:0}: Error finding container 0b74d68ec031a8ed47a82a45ef679d786a9d79686286585373b1816ac7eed7c0: Status 404 returned error can't find the container with id 0b74d68ec031a8ed47a82a45ef679d786a9d79686286585373b1816ac7eed7c0 Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.394423 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-5fb4f"] Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.460445 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5fb4f" event={"ID":"77049c85-7aed-49f4-8dff-4a9a7a3a6b06","Type":"ContainerStarted","Data":"0b74d68ec031a8ed47a82a45ef679d786a9d79686286585373b1816ac7eed7c0"} Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.462008 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b379783f-0164-447b-a92f-408ff7901cea","Type":"ContainerStarted","Data":"954dbbc2324a332622ab560b9eb88c0a1013b51db8d07b486a8f3515ca618335"} Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.527363 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6c7e28d1-6897-4f5f-ad56-6036055365ad-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zsswn\" (UID: \"6c7e28d1-6897-4f5f-ad56-6036055365ad\") " pod="openstack/nova-cell1-conductor-db-sync-zsswn" Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.527450 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9qlb\" (UniqueName: \"kubernetes.io/projected/6c7e28d1-6897-4f5f-ad56-6036055365ad-kube-api-access-b9qlb\") pod \"nova-cell1-conductor-db-sync-zsswn\" (UID: \"6c7e28d1-6897-4f5f-ad56-6036055365ad\") " pod="openstack/nova-cell1-conductor-db-sync-zsswn" Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.527504 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c7e28d1-6897-4f5f-ad56-6036055365ad-scripts\") pod \"nova-cell1-conductor-db-sync-zsswn\" (UID: \"6c7e28d1-6897-4f5f-ad56-6036055365ad\") " pod="openstack/nova-cell1-conductor-db-sync-zsswn" Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.527543 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c7e28d1-6897-4f5f-ad56-6036055365ad-config-data\") pod \"nova-cell1-conductor-db-sync-zsswn\" (UID: \"6c7e28d1-6897-4f5f-ad56-6036055365ad\") " pod="openstack/nova-cell1-conductor-db-sync-zsswn" Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.611853 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 15:02:49 crc kubenswrapper[4774]: W1003 15:02:49.618654 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76464c8b_55de_4a35_91ab_8cee9db23bf7.slice/crio-3b177d13f217a9408fbf6d7a50f9f857bcf02e7fedff5439f601faa34883e513 WatchSource:0}: Error finding container 
3b177d13f217a9408fbf6d7a50f9f857bcf02e7fedff5439f601faa34883e513: Status 404 returned error can't find the container with id 3b177d13f217a9408fbf6d7a50f9f857bcf02e7fedff5439f601faa34883e513 Oct 03 15:02:49 crc kubenswrapper[4774]: W1003 15:02:49.621650 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcee99571_3f9a_49ba_bced_bbb3e3a723e7.slice/crio-d9218f12af1f4b62b87429e5a9fa725c6a58ed2314b58761a08714084bf94ceb WatchSource:0}: Error finding container d9218f12af1f4b62b87429e5a9fa725c6a58ed2314b58761a08714084bf94ceb: Status 404 returned error can't find the container with id d9218f12af1f4b62b87429e5a9fa725c6a58ed2314b58761a08714084bf94ceb Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.624042 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.629583 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c7e28d1-6897-4f5f-ad56-6036055365ad-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zsswn\" (UID: \"6c7e28d1-6897-4f5f-ad56-6036055365ad\") " pod="openstack/nova-cell1-conductor-db-sync-zsswn" Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.629629 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9qlb\" (UniqueName: \"kubernetes.io/projected/6c7e28d1-6897-4f5f-ad56-6036055365ad-kube-api-access-b9qlb\") pod \"nova-cell1-conductor-db-sync-zsswn\" (UID: \"6c7e28d1-6897-4f5f-ad56-6036055365ad\") " pod="openstack/nova-cell1-conductor-db-sync-zsswn" Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.629683 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c7e28d1-6897-4f5f-ad56-6036055365ad-scripts\") pod \"nova-cell1-conductor-db-sync-zsswn\" (UID: 
\"6c7e28d1-6897-4f5f-ad56-6036055365ad\") " pod="openstack/nova-cell1-conductor-db-sync-zsswn" Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.629725 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c7e28d1-6897-4f5f-ad56-6036055365ad-config-data\") pod \"nova-cell1-conductor-db-sync-zsswn\" (UID: \"6c7e28d1-6897-4f5f-ad56-6036055365ad\") " pod="openstack/nova-cell1-conductor-db-sync-zsswn" Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.638330 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c7e28d1-6897-4f5f-ad56-6036055365ad-config-data\") pod \"nova-cell1-conductor-db-sync-zsswn\" (UID: \"6c7e28d1-6897-4f5f-ad56-6036055365ad\") " pod="openstack/nova-cell1-conductor-db-sync-zsswn" Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.640064 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c7e28d1-6897-4f5f-ad56-6036055365ad-scripts\") pod \"nova-cell1-conductor-db-sync-zsswn\" (UID: \"6c7e28d1-6897-4f5f-ad56-6036055365ad\") " pod="openstack/nova-cell1-conductor-db-sync-zsswn" Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.645731 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c7e28d1-6897-4f5f-ad56-6036055365ad-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zsswn\" (UID: \"6c7e28d1-6897-4f5f-ad56-6036055365ad\") " pod="openstack/nova-cell1-conductor-db-sync-zsswn" Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.647952 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9qlb\" (UniqueName: \"kubernetes.io/projected/6c7e28d1-6897-4f5f-ad56-6036055365ad-kube-api-access-b9qlb\") pod \"nova-cell1-conductor-db-sync-zsswn\" (UID: \"6c7e28d1-6897-4f5f-ad56-6036055365ad\") 
" pod="openstack/nova-cell1-conductor-db-sync-zsswn" Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.691273 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zsswn" Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.786710 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 15:02:49 crc kubenswrapper[4774]: W1003 15:02:49.814581 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51d6dfe6_9084_4868_93c7_58fe660a6d98.slice/crio-4205844f257ec77cf70f31bbd93545ab598e4858460720b4b87a6db8b74377c2 WatchSource:0}: Error finding container 4205844f257ec77cf70f31bbd93545ab598e4858460720b4b87a6db8b74377c2: Status 404 returned error can't find the container with id 4205844f257ec77cf70f31bbd93545ab598e4858460720b4b87a6db8b74377c2 Oct 03 15:02:49 crc kubenswrapper[4774]: I1003 15:02:49.881990 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-jshth"] Oct 03 15:02:50 crc kubenswrapper[4774]: I1003 15:02:50.139682 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zsswn"] Oct 03 15:02:50 crc kubenswrapper[4774]: W1003 15:02:50.159972 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c7e28d1_6897_4f5f_ad56_6036055365ad.slice/crio-c5d2eaa1980654cf584f83a8ce64672d208c93c9d0174397d78954737c9a4eaa WatchSource:0}: Error finding container c5d2eaa1980654cf584f83a8ce64672d208c93c9d0174397d78954737c9a4eaa: Status 404 returned error can't find the container with id c5d2eaa1980654cf584f83a8ce64672d208c93c9d0174397d78954737c9a4eaa Oct 03 15:02:50 crc kubenswrapper[4774]: I1003 15:02:50.472727 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"51d6dfe6-9084-4868-93c7-58fe660a6d98","Type":"ContainerStarted","Data":"4205844f257ec77cf70f31bbd93545ab598e4858460720b4b87a6db8b74377c2"} Oct 03 15:02:50 crc kubenswrapper[4774]: I1003 15:02:50.474705 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zsswn" event={"ID":"6c7e28d1-6897-4f5f-ad56-6036055365ad","Type":"ContainerStarted","Data":"ab49b1b43e1afd447c085cc1fb1a827df7e4610d24b0645fb5af172b3d241d8b"} Oct 03 15:02:50 crc kubenswrapper[4774]: I1003 15:02:50.474756 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zsswn" event={"ID":"6c7e28d1-6897-4f5f-ad56-6036055365ad","Type":"ContainerStarted","Data":"c5d2eaa1980654cf584f83a8ce64672d208c93c9d0174397d78954737c9a4eaa"} Oct 03 15:02:50 crc kubenswrapper[4774]: I1003 15:02:50.476573 4774 generic.go:334] "Generic (PLEG): container finished" podID="a4786927-81ff-4f7e-9c17-558e01bf47fe" containerID="36893f18d2527b640238e2e667734c422f32680734c79fa1d94a6ba50ec5f8b5" exitCode=0 Oct 03 15:02:50 crc kubenswrapper[4774]: I1003 15:02:50.476645 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-jshth" event={"ID":"a4786927-81ff-4f7e-9c17-558e01bf47fe","Type":"ContainerDied","Data":"36893f18d2527b640238e2e667734c422f32680734c79fa1d94a6ba50ec5f8b5"} Oct 03 15:02:50 crc kubenswrapper[4774]: I1003 15:02:50.476676 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-jshth" event={"ID":"a4786927-81ff-4f7e-9c17-558e01bf47fe","Type":"ContainerStarted","Data":"d04ef0c70f16a01663a74572f22767932b5fa4a623cb9c7a57afa61380f02f23"} Oct 03 15:02:50 crc kubenswrapper[4774]: I1003 15:02:50.480943 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cee99571-3f9a-49ba-bced-bbb3e3a723e7","Type":"ContainerStarted","Data":"d9218f12af1f4b62b87429e5a9fa725c6a58ed2314b58761a08714084bf94ceb"} Oct 03 
Oct 03 15:02:50 crc kubenswrapper[4774]: I1003 15:02:50.482997 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"76464c8b-55de-4a35-91ab-8cee9db23bf7","Type":"ContainerStarted","Data":"3b177d13f217a9408fbf6d7a50f9f857bcf02e7fedff5439f601faa34883e513"}
Oct 03 15:02:50 crc kubenswrapper[4774]: I1003 15:02:50.485866 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5fb4f" event={"ID":"77049c85-7aed-49f4-8dff-4a9a7a3a6b06","Type":"ContainerStarted","Data":"66c704702b10730f4806d6cc857a5fb52bea144f753d5d46ba8e711624b7b866"}
Oct 03 15:02:50 crc kubenswrapper[4774]: I1003 15:02:50.500570 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-zsswn" podStartSLOduration=1.5005507740000001 podStartE2EDuration="1.500550774s" podCreationTimestamp="2025-10-03 15:02:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:02:50.492896234 +0000 UTC m=+1193.082099686" watchObservedRunningTime="2025-10-03 15:02:50.500550774 +0000 UTC m=+1193.089754226"
Oct 03 15:02:50 crc kubenswrapper[4774]: I1003 15:02:50.546206 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-5fb4f" podStartSLOduration=2.546186465 podStartE2EDuration="2.546186465s" podCreationTimestamp="2025-10-03 15:02:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:02:50.539809107 +0000 UTC m=+1193.129012559" watchObservedRunningTime="2025-10-03 15:02:50.546186465 +0000 UTC m=+1193.135389917"
Oct 03 15:02:50 crc kubenswrapper[4774]: I1003 15:02:50.655242 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 15:02:50 crc kubenswrapper[4774]: I1003 15:02:50.655337 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 15:02:51 crc kubenswrapper[4774]: I1003 15:02:51.496159 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-jshth" event={"ID":"a4786927-81ff-4f7e-9c17-558e01bf47fe","Type":"ContainerStarted","Data":"94938aa23d0960766d355e1b39181d660e7ad8202dcf755e1064d794eff42e3c"}
Oct 03 15:02:51 crc kubenswrapper[4774]: I1003 15:02:51.516728 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-845d6d6f59-jshth" podStartSLOduration=3.516707659 podStartE2EDuration="3.516707659s" podCreationTimestamp="2025-10-03 15:02:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:02:51.514466304 +0000 UTC m=+1194.103669756" watchObservedRunningTime="2025-10-03 15:02:51.516707659 +0000 UTC m=+1194.105911111"
Oct 03 15:02:52 crc kubenswrapper[4774]: I1003 15:02:52.505347 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-845d6d6f59-jshth"
Oct 03 15:02:52 crc kubenswrapper[4774]: I1003 15:02:52.573933 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 03 15:02:52 crc kubenswrapper[4774]: I1003 15:02:52.611467 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 15:02:57 crc kubenswrapper[4774]: I1003 15:02:57.603316 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Oct 03 15:02:58 crc kubenswrapper[4774]: I1003 15:02:58.570297 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cee99571-3f9a-49ba-bced-bbb3e3a723e7","Type":"ContainerStarted","Data":"ea00fdd4bff47426ae613c584b950e18dcd592788814ad50bf80cea80b9d70c9"}
Oct 03 15:02:58 crc kubenswrapper[4774]: I1003 15:02:58.570386 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="cee99571-3f9a-49ba-bced-bbb3e3a723e7" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://ea00fdd4bff47426ae613c584b950e18dcd592788814ad50bf80cea80b9d70c9" gracePeriod=30
Oct 03 15:02:58 crc kubenswrapper[4774]: I1003 15:02:58.572800 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"76464c8b-55de-4a35-91ab-8cee9db23bf7","Type":"ContainerStarted","Data":"cb39e10eda03d794bacdb597f3b76a9eb8922ffe455b02d2805099bb5d135c90"}
Oct 03 15:02:58 crc kubenswrapper[4774]: I1003 15:02:58.572849 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"76464c8b-55de-4a35-91ab-8cee9db23bf7","Type":"ContainerStarted","Data":"4a0e88dcb2bfde4646459d68166019721ef4badd66a6f7b4dca088147ef07308"}
Oct 03 15:02:58 crc kubenswrapper[4774]: I1003 15:02:58.576716 4774 generic.go:334] "Generic (PLEG): container finished" podID="77049c85-7aed-49f4-8dff-4a9a7a3a6b06" containerID="66c704702b10730f4806d6cc857a5fb52bea144f753d5d46ba8e711624b7b866" exitCode=0
Oct 03 15:02:58 crc kubenswrapper[4774]: I1003 15:02:58.576806 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5fb4f" event={"ID":"77049c85-7aed-49f4-8dff-4a9a7a3a6b06","Type":"ContainerDied","Data":"66c704702b10730f4806d6cc857a5fb52bea144f753d5d46ba8e711624b7b866"}
Oct 03 15:02:58 crc kubenswrapper[4774]: I1003 15:02:58.579254 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"51d6dfe6-9084-4868-93c7-58fe660a6d98","Type":"ContainerStarted","Data":"eac9131c35cd5caa6f3ef9ceed6d1189e16946d2f44e30acc67bf5439c02c878"}
Oct 03 15:02:58 crc kubenswrapper[4774]: I1003 15:02:58.579291 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"51d6dfe6-9084-4868-93c7-58fe660a6d98","Type":"ContainerStarted","Data":"0dd503753713b1768b8ba808ac021116a0255cac49248d6391778074b18b80f8"}
Oct 03 15:02:58 crc kubenswrapper[4774]: I1003 15:02:58.579325 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="51d6dfe6-9084-4868-93c7-58fe660a6d98" containerName="nova-metadata-metadata" containerID="cri-o://eac9131c35cd5caa6f3ef9ceed6d1189e16946d2f44e30acc67bf5439c02c878" gracePeriod=30
Oct 03 15:02:58 crc kubenswrapper[4774]: I1003 15:02:58.579320 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="51d6dfe6-9084-4868-93c7-58fe660a6d98" containerName="nova-metadata-log" containerID="cri-o://0dd503753713b1768b8ba808ac021116a0255cac49248d6391778074b18b80f8" gracePeriod=30
Oct 03 15:02:58 crc kubenswrapper[4774]: I1003 15:02:58.585037 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b379783f-0164-447b-a92f-408ff7901cea","Type":"ContainerStarted","Data":"0c608fd982706c9ee8fdc248ceb58bc434ede65e653c5cfc23898528594081fc"}
Oct 03 15:02:58 crc kubenswrapper[4774]: I1003 15:02:58.602896 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.018628253 podStartE2EDuration="10.602870784s" podCreationTimestamp="2025-10-03 15:02:48 +0000 UTC" firstStartedPulling="2025-10-03 15:02:49.624619543 +0000 UTC m=+1192.213822995" lastFinishedPulling="2025-10-03 15:02:57.208862074 +0000 UTC m=+1199.798065526" observedRunningTime="2025-10-03 15:02:58.593240826 +0000 UTC m=+1201.182444288" watchObservedRunningTime="2025-10-03 15:02:58.602870784 +0000 UTC m=+1201.192074236"
Oct 03 15:02:58 crc kubenswrapper[4774]: I1003 15:02:58.618676 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.233989863 podStartE2EDuration="10.618653086s" podCreationTimestamp="2025-10-03 15:02:48 +0000 UTC" firstStartedPulling="2025-10-03 15:02:49.821569455 +0000 UTC m=+1192.410772907" lastFinishedPulling="2025-10-03 15:02:57.206232678 +0000 UTC m=+1199.795436130" observedRunningTime="2025-10-03 15:02:58.612214966 +0000 UTC m=+1201.201418418" watchObservedRunningTime="2025-10-03 15:02:58.618653086 +0000 UTC m=+1201.207856538"
Oct 03 15:02:58 crc kubenswrapper[4774]: I1003 15:02:58.656520 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.727900978 podStartE2EDuration="10.656500254s" podCreationTimestamp="2025-10-03 15:02:48 +0000 UTC" firstStartedPulling="2025-10-03 15:02:49.278541305 +0000 UTC m=+1191.867744757" lastFinishedPulling="2025-10-03 15:02:57.207140581 +0000 UTC m=+1199.796344033" observedRunningTime="2025-10-03 15:02:58.636948359 +0000 UTC m=+1201.226151811" watchObservedRunningTime="2025-10-03 15:02:58.656500254 +0000 UTC m=+1201.245703726"
Oct 03 15:02:58 crc kubenswrapper[4774]: I1003 15:02:58.686760 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.057716313 podStartE2EDuration="10.686740003s" podCreationTimestamp="2025-10-03 15:02:48 +0000 UTC" firstStartedPulling="2025-10-03 15:02:49.620981053 +0000 UTC m=+1192.210184505" lastFinishedPulling="2025-10-03 15:02:57.250004753 +0000 UTC m=+1199.839208195" observedRunningTime="2025-10-03 15:02:58.67813516 +0000 UTC m=+1201.267338632" watchObservedRunningTime="2025-10-03 15:02:58.686740003 +0000 UTC m=+1201.275943455"
Oct 03 15:02:58 crc kubenswrapper[4774]: I1003 15:02:58.755177 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Oct 03 15:02:58 crc kubenswrapper[4774]: I1003 15:02:58.755427 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Oct 03 15:02:58 crc kubenswrapper[4774]: I1003 15:02:58.783100 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.052167 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.072939 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.076225 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.076263 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.112622 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-845d6d6f59-jshth"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.170122 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-lvqzl"]
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.170634 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5784cf869f-lvqzl" podUID="900d9b8b-8106-44c5-a25a-db56b0639d7f" containerName="dnsmasq-dns" containerID="cri-o://19b2a88f3ad1e8d5bb7b074b82327f25e7eee7aa1dfd2f202b6eb0f21e8b545e" gracePeriod=10
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.217135 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq4sg\" (UniqueName: \"kubernetes.io/projected/51d6dfe6-9084-4868-93c7-58fe660a6d98-kube-api-access-vq4sg\") pod \"51d6dfe6-9084-4868-93c7-58fe660a6d98\" (UID: \"51d6dfe6-9084-4868-93c7-58fe660a6d98\") "
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.217271 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51d6dfe6-9084-4868-93c7-58fe660a6d98-logs\") pod \"51d6dfe6-9084-4868-93c7-58fe660a6d98\" (UID: \"51d6dfe6-9084-4868-93c7-58fe660a6d98\") "
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.217323 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d6dfe6-9084-4868-93c7-58fe660a6d98-combined-ca-bundle\") pod \"51d6dfe6-9084-4868-93c7-58fe660a6d98\" (UID: \"51d6dfe6-9084-4868-93c7-58fe660a6d98\") "
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.217423 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51d6dfe6-9084-4868-93c7-58fe660a6d98-config-data\") pod \"51d6dfe6-9084-4868-93c7-58fe660a6d98\" (UID: \"51d6dfe6-9084-4868-93c7-58fe660a6d98\") "
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.219517 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51d6dfe6-9084-4868-93c7-58fe660a6d98-logs" (OuterVolumeSpecName: "logs") pod "51d6dfe6-9084-4868-93c7-58fe660a6d98" (UID: "51d6dfe6-9084-4868-93c7-58fe660a6d98"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.261614 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51d6dfe6-9084-4868-93c7-58fe660a6d98-kube-api-access-vq4sg" (OuterVolumeSpecName: "kube-api-access-vq4sg") pod "51d6dfe6-9084-4868-93c7-58fe660a6d98" (UID: "51d6dfe6-9084-4868-93c7-58fe660a6d98"). InnerVolumeSpecName "kube-api-access-vq4sg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.322877 4774 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51d6dfe6-9084-4868-93c7-58fe660a6d98-logs\") on node \"crc\" DevicePath \"\""
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.323394 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vq4sg\" (UniqueName: \"kubernetes.io/projected/51d6dfe6-9084-4868-93c7-58fe660a6d98-kube-api-access-vq4sg\") on node \"crc\" DevicePath \"\""
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.441320 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d6dfe6-9084-4868-93c7-58fe660a6d98-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51d6dfe6-9084-4868-93c7-58fe660a6d98" (UID: "51d6dfe6-9084-4868-93c7-58fe660a6d98"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.513635 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d6dfe6-9084-4868-93c7-58fe660a6d98-config-data" (OuterVolumeSpecName: "config-data") pod "51d6dfe6-9084-4868-93c7-58fe660a6d98" (UID: "51d6dfe6-9084-4868-93c7-58fe660a6d98"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.529984 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51d6dfe6-9084-4868-93c7-58fe660a6d98-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.530015 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d6dfe6-9084-4868-93c7-58fe660a6d98-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.613280 4774 generic.go:334] "Generic (PLEG): container finished" podID="51d6dfe6-9084-4868-93c7-58fe660a6d98" containerID="eac9131c35cd5caa6f3ef9ceed6d1189e16946d2f44e30acc67bf5439c02c878" exitCode=0
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.613306 4774 generic.go:334] "Generic (PLEG): container finished" podID="51d6dfe6-9084-4868-93c7-58fe660a6d98" containerID="0dd503753713b1768b8ba808ac021116a0255cac49248d6391778074b18b80f8" exitCode=143
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.613343 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"51d6dfe6-9084-4868-93c7-58fe660a6d98","Type":"ContainerDied","Data":"eac9131c35cd5caa6f3ef9ceed6d1189e16946d2f44e30acc67bf5439c02c878"}
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.613381 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"51d6dfe6-9084-4868-93c7-58fe660a6d98","Type":"ContainerDied","Data":"0dd503753713b1768b8ba808ac021116a0255cac49248d6391778074b18b80f8"}
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.613392 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"51d6dfe6-9084-4868-93c7-58fe660a6d98","Type":"ContainerDied","Data":"4205844f257ec77cf70f31bbd93545ab598e4858460720b4b87a6db8b74377c2"}
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.613406 4774 scope.go:117] "RemoveContainer" containerID="eac9131c35cd5caa6f3ef9ceed6d1189e16946d2f44e30acc67bf5439c02c878"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.613565 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.642558 4774 generic.go:334] "Generic (PLEG): container finished" podID="900d9b8b-8106-44c5-a25a-db56b0639d7f" containerID="19b2a88f3ad1e8d5bb7b074b82327f25e7eee7aa1dfd2f202b6eb0f21e8b545e" exitCode=0
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.643320 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-lvqzl" event={"ID":"900d9b8b-8106-44c5-a25a-db56b0639d7f","Type":"ContainerDied","Data":"19b2a88f3ad1e8d5bb7b074b82327f25e7eee7aa1dfd2f202b6eb0f21e8b545e"}
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.670151 4774 scope.go:117] "RemoveContainer" containerID="0dd503753713b1768b8ba808ac021116a0255cac49248d6391778074b18b80f8"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.676463 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.696886 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.713064 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 15:02:59 crc kubenswrapper[4774]: E1003 15:02:59.713624 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51d6dfe6-9084-4868-93c7-58fe660a6d98" containerName="nova-metadata-log"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.713646 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="51d6dfe6-9084-4868-93c7-58fe660a6d98" containerName="nova-metadata-log"
Oct 03 15:02:59 crc kubenswrapper[4774]: E1003 15:02:59.713681 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51d6dfe6-9084-4868-93c7-58fe660a6d98" containerName="nova-metadata-metadata"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.713691 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="51d6dfe6-9084-4868-93c7-58fe660a6d98" containerName="nova-metadata-metadata"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.713916 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="51d6dfe6-9084-4868-93c7-58fe660a6d98" containerName="nova-metadata-log"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.713938 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="51d6dfe6-9084-4868-93c7-58fe660a6d98" containerName="nova-metadata-metadata"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.715141 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.715578 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.720046 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.721525 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.723796 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.745650 4774 scope.go:117] "RemoveContainer" containerID="eac9131c35cd5caa6f3ef9ceed6d1189e16946d2f44e30acc67bf5439c02c878"
Oct 03 15:02:59 crc kubenswrapper[4774]: E1003 15:02:59.746568 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eac9131c35cd5caa6f3ef9ceed6d1189e16946d2f44e30acc67bf5439c02c878\": container with ID starting with eac9131c35cd5caa6f3ef9ceed6d1189e16946d2f44e30acc67bf5439c02c878 not found: ID does not exist" containerID="eac9131c35cd5caa6f3ef9ceed6d1189e16946d2f44e30acc67bf5439c02c878"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.746603 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eac9131c35cd5caa6f3ef9ceed6d1189e16946d2f44e30acc67bf5439c02c878"} err="failed to get container status \"eac9131c35cd5caa6f3ef9ceed6d1189e16946d2f44e30acc67bf5439c02c878\": rpc error: code = NotFound desc = could not find container \"eac9131c35cd5caa6f3ef9ceed6d1189e16946d2f44e30acc67bf5439c02c878\": container with ID starting with eac9131c35cd5caa6f3ef9ceed6d1189e16946d2f44e30acc67bf5439c02c878 not found: ID does not exist"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.746631 4774 scope.go:117] "RemoveContainer" containerID="0dd503753713b1768b8ba808ac021116a0255cac49248d6391778074b18b80f8"
Oct 03 15:02:59 crc kubenswrapper[4774]: E1003 15:02:59.746912 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dd503753713b1768b8ba808ac021116a0255cac49248d6391778074b18b80f8\": container with ID starting with 0dd503753713b1768b8ba808ac021116a0255cac49248d6391778074b18b80f8 not found: ID does not exist" containerID="0dd503753713b1768b8ba808ac021116a0255cac49248d6391778074b18b80f8"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.746929 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dd503753713b1768b8ba808ac021116a0255cac49248d6391778074b18b80f8"} err="failed to get container status \"0dd503753713b1768b8ba808ac021116a0255cac49248d6391778074b18b80f8\": rpc error: code = NotFound desc = could not find container \"0dd503753713b1768b8ba808ac021116a0255cac49248d6391778074b18b80f8\": container with ID starting with 0dd503753713b1768b8ba808ac021116a0255cac49248d6391778074b18b80f8 not found: ID does not exist"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.746941 4774 scope.go:117] "RemoveContainer" containerID="eac9131c35cd5caa6f3ef9ceed6d1189e16946d2f44e30acc67bf5439c02c878"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.747276 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eac9131c35cd5caa6f3ef9ceed6d1189e16946d2f44e30acc67bf5439c02c878"} err="failed to get container status \"eac9131c35cd5caa6f3ef9ceed6d1189e16946d2f44e30acc67bf5439c02c878\": rpc error: code = NotFound desc = could not find container \"eac9131c35cd5caa6f3ef9ceed6d1189e16946d2f44e30acc67bf5439c02c878\": container with ID starting with eac9131c35cd5caa6f3ef9ceed6d1189e16946d2f44e30acc67bf5439c02c878 not found: ID does not exist"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.747290 4774 scope.go:117] "RemoveContainer" containerID="0dd503753713b1768b8ba808ac021116a0255cac49248d6391778074b18b80f8"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.747493 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dd503753713b1768b8ba808ac021116a0255cac49248d6391778074b18b80f8"} err="failed to get container status \"0dd503753713b1768b8ba808ac021116a0255cac49248d6391778074b18b80f8\": rpc error: code = NotFound desc = could not find container \"0dd503753713b1768b8ba808ac021116a0255cac49248d6391778074b18b80f8\": container with ID starting with 0dd503753713b1768b8ba808ac021116a0255cac49248d6391778074b18b80f8 not found: ID does not exist"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.837762 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1a3949e-e232-492e-98fe-47a948f55f73-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c1a3949e-e232-492e-98fe-47a948f55f73\") " pod="openstack/nova-metadata-0"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.837818 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjfch\" (UniqueName: \"kubernetes.io/projected/c1a3949e-e232-492e-98fe-47a948f55f73-kube-api-access-mjfch\") pod \"nova-metadata-0\" (UID: \"c1a3949e-e232-492e-98fe-47a948f55f73\") " pod="openstack/nova-metadata-0"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.837894 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1a3949e-e232-492e-98fe-47a948f55f73-config-data\") pod \"nova-metadata-0\" (UID: \"c1a3949e-e232-492e-98fe-47a948f55f73\") " pod="openstack/nova-metadata-0"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.837938 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1a3949e-e232-492e-98fe-47a948f55f73-logs\") pod \"nova-metadata-0\" (UID: \"c1a3949e-e232-492e-98fe-47a948f55f73\") " pod="openstack/nova-metadata-0"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.838010 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1a3949e-e232-492e-98fe-47a948f55f73-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c1a3949e-e232-492e-98fe-47a948f55f73\") " pod="openstack/nova-metadata-0"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.864229 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-lvqzl"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.939509 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/900d9b8b-8106-44c5-a25a-db56b0639d7f-dns-svc\") pod \"900d9b8b-8106-44c5-a25a-db56b0639d7f\" (UID: \"900d9b8b-8106-44c5-a25a-db56b0639d7f\") "
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.939559 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/900d9b8b-8106-44c5-a25a-db56b0639d7f-config\") pod \"900d9b8b-8106-44c5-a25a-db56b0639d7f\" (UID: \"900d9b8b-8106-44c5-a25a-db56b0639d7f\") "
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.939674 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/900d9b8b-8106-44c5-a25a-db56b0639d7f-ovsdbserver-sb\") pod \"900d9b8b-8106-44c5-a25a-db56b0639d7f\" (UID: \"900d9b8b-8106-44c5-a25a-db56b0639d7f\") "
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.939753 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/900d9b8b-8106-44c5-a25a-db56b0639d7f-ovsdbserver-nb\") pod \"900d9b8b-8106-44c5-a25a-db56b0639d7f\" (UID: \"900d9b8b-8106-44c5-a25a-db56b0639d7f\") "
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.939777 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56c6f\" (UniqueName: \"kubernetes.io/projected/900d9b8b-8106-44c5-a25a-db56b0639d7f-kube-api-access-56c6f\") pod \"900d9b8b-8106-44c5-a25a-db56b0639d7f\" (UID: \"900d9b8b-8106-44c5-a25a-db56b0639d7f\") "
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.939797 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/900d9b8b-8106-44c5-a25a-db56b0639d7f-dns-swift-storage-0\") pod \"900d9b8b-8106-44c5-a25a-db56b0639d7f\" (UID: \"900d9b8b-8106-44c5-a25a-db56b0639d7f\") "
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.940081 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1a3949e-e232-492e-98fe-47a948f55f73-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c1a3949e-e232-492e-98fe-47a948f55f73\") " pod="openstack/nova-metadata-0"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.940102 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjfch\" (UniqueName: \"kubernetes.io/projected/c1a3949e-e232-492e-98fe-47a948f55f73-kube-api-access-mjfch\") pod \"nova-metadata-0\" (UID: \"c1a3949e-e232-492e-98fe-47a948f55f73\") " pod="openstack/nova-metadata-0"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.940155 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1a3949e-e232-492e-98fe-47a948f55f73-config-data\") pod \"nova-metadata-0\" (UID: \"c1a3949e-e232-492e-98fe-47a948f55f73\") " pod="openstack/nova-metadata-0"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.940185 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1a3949e-e232-492e-98fe-47a948f55f73-logs\") pod \"nova-metadata-0\" (UID: \"c1a3949e-e232-492e-98fe-47a948f55f73\") " pod="openstack/nova-metadata-0"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.940236 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1a3949e-e232-492e-98fe-47a948f55f73-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c1a3949e-e232-492e-98fe-47a948f55f73\") " pod="openstack/nova-metadata-0"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.946863 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1a3949e-e232-492e-98fe-47a948f55f73-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c1a3949e-e232-492e-98fe-47a948f55f73\") " pod="openstack/nova-metadata-0"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.947282 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1a3949e-e232-492e-98fe-47a948f55f73-logs\") pod \"nova-metadata-0\" (UID: \"c1a3949e-e232-492e-98fe-47a948f55f73\") " pod="openstack/nova-metadata-0"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.949097 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/900d9b8b-8106-44c5-a25a-db56b0639d7f-kube-api-access-56c6f" (OuterVolumeSpecName: "kube-api-access-56c6f") pod "900d9b8b-8106-44c5-a25a-db56b0639d7f" (UID: "900d9b8b-8106-44c5-a25a-db56b0639d7f"). InnerVolumeSpecName "kube-api-access-56c6f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.952352 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1a3949e-e232-492e-98fe-47a948f55f73-config-data\") pod \"nova-metadata-0\" (UID: \"c1a3949e-e232-492e-98fe-47a948f55f73\") " pod="openstack/nova-metadata-0"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.962182 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1a3949e-e232-492e-98fe-47a948f55f73-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c1a3949e-e232-492e-98fe-47a948f55f73\") " pod="openstack/nova-metadata-0"
Oct 03 15:02:59 crc kubenswrapper[4774]: I1003 15:02:59.987943 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjfch\" (UniqueName: \"kubernetes.io/projected/c1a3949e-e232-492e-98fe-47a948f55f73-kube-api-access-mjfch\") pod \"nova-metadata-0\" (UID: \"c1a3949e-e232-492e-98fe-47a948f55f73\") " pod="openstack/nova-metadata-0"
Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.013216 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/900d9b8b-8106-44c5-a25a-db56b0639d7f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "900d9b8b-8106-44c5-a25a-db56b0639d7f" (UID: "900d9b8b-8106-44c5-a25a-db56b0639d7f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.035152 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/900d9b8b-8106-44c5-a25a-db56b0639d7f-config" (OuterVolumeSpecName: "config") pod "900d9b8b-8106-44c5-a25a-db56b0639d7f" (UID: "900d9b8b-8106-44c5-a25a-db56b0639d7f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.039223 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/900d9b8b-8106-44c5-a25a-db56b0639d7f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "900d9b8b-8106-44c5-a25a-db56b0639d7f" (UID: "900d9b8b-8106-44c5-a25a-db56b0639d7f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.043387 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56c6f\" (UniqueName: \"kubernetes.io/projected/900d9b8b-8106-44c5-a25a-db56b0639d7f-kube-api-access-56c6f\") on node \"crc\" DevicePath \"\""
Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.043406 4774 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/900d9b8b-8106-44c5-a25a-db56b0639d7f-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.043417 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/900d9b8b-8106-44c5-a25a-db56b0639d7f-config\") on node \"crc\" DevicePath \"\""
Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.043427 4774 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/900d9b8b-8106-44c5-a25a-db56b0639d7f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.054410 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.070568 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/900d9b8b-8106-44c5-a25a-db56b0639d7f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "900d9b8b-8106-44c5-a25a-db56b0639d7f" (UID: "900d9b8b-8106-44c5-a25a-db56b0639d7f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.074816 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5fb4f"
Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.087417 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/900d9b8b-8106-44c5-a25a-db56b0639d7f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "900d9b8b-8106-44c5-a25a-db56b0639d7f" (UID: "900d9b8b-8106-44c5-a25a-db56b0639d7f"). InnerVolumeSpecName "ovsdbserver-nb".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.146117 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77049c85-7aed-49f4-8dff-4a9a7a3a6b06-config-data\") pod \"77049c85-7aed-49f4-8dff-4a9a7a3a6b06\" (UID: \"77049c85-7aed-49f4-8dff-4a9a7a3a6b06\") " Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.146202 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77049c85-7aed-49f4-8dff-4a9a7a3a6b06-scripts\") pod \"77049c85-7aed-49f4-8dff-4a9a7a3a6b06\" (UID: \"77049c85-7aed-49f4-8dff-4a9a7a3a6b06\") " Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.146244 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77049c85-7aed-49f4-8dff-4a9a7a3a6b06-combined-ca-bundle\") pod \"77049c85-7aed-49f4-8dff-4a9a7a3a6b06\" (UID: \"77049c85-7aed-49f4-8dff-4a9a7a3a6b06\") " Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.146315 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8szvm\" (UniqueName: \"kubernetes.io/projected/77049c85-7aed-49f4-8dff-4a9a7a3a6b06-kube-api-access-8szvm\") pod \"77049c85-7aed-49f4-8dff-4a9a7a3a6b06\" (UID: \"77049c85-7aed-49f4-8dff-4a9a7a3a6b06\") " Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.146739 4774 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/900d9b8b-8106-44c5-a25a-db56b0639d7f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.146756 4774 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/900d9b8b-8106-44c5-a25a-db56b0639d7f-dns-swift-storage-0\") on node \"crc\" 
DevicePath \"\"" Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.154546 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77049c85-7aed-49f4-8dff-4a9a7a3a6b06-kube-api-access-8szvm" (OuterVolumeSpecName: "kube-api-access-8szvm") pod "77049c85-7aed-49f4-8dff-4a9a7a3a6b06" (UID: "77049c85-7aed-49f4-8dff-4a9a7a3a6b06"). InnerVolumeSpecName "kube-api-access-8szvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.167098 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77049c85-7aed-49f4-8dff-4a9a7a3a6b06-scripts" (OuterVolumeSpecName: "scripts") pod "77049c85-7aed-49f4-8dff-4a9a7a3a6b06" (UID: "77049c85-7aed-49f4-8dff-4a9a7a3a6b06"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.167322 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="76464c8b-55de-4a35-91ab-8cee9db23bf7" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.189:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.167752 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="76464c8b-55de-4a35-91ab-8cee9db23bf7" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.189:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.216558 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77049c85-7aed-49f4-8dff-4a9a7a3a6b06-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "77049c85-7aed-49f4-8dff-4a9a7a3a6b06" (UID: "77049c85-7aed-49f4-8dff-4a9a7a3a6b06"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.220793 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77049c85-7aed-49f4-8dff-4a9a7a3a6b06-config-data" (OuterVolumeSpecName: "config-data") pod "77049c85-7aed-49f4-8dff-4a9a7a3a6b06" (UID: "77049c85-7aed-49f4-8dff-4a9a7a3a6b06"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.249145 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77049c85-7aed-49f4-8dff-4a9a7a3a6b06-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.249175 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77049c85-7aed-49f4-8dff-4a9a7a3a6b06-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.249184 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77049c85-7aed-49f4-8dff-4a9a7a3a6b06-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.249195 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8szvm\" (UniqueName: \"kubernetes.io/projected/77049c85-7aed-49f4-8dff-4a9a7a3a6b06-kube-api-access-8szvm\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.575172 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.660905 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5fb4f" 
event={"ID":"77049c85-7aed-49f4-8dff-4a9a7a3a6b06","Type":"ContainerDied","Data":"0b74d68ec031a8ed47a82a45ef679d786a9d79686286585373b1816ac7eed7c0"} Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.660972 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b74d68ec031a8ed47a82a45ef679d786a9d79686286585373b1816ac7eed7c0" Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.661055 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5fb4f" Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.698060 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-lvqzl" event={"ID":"900d9b8b-8106-44c5-a25a-db56b0639d7f","Type":"ContainerDied","Data":"6ea16e4d0e7a42989f3c5673a7df349930d3c139091e35d990d35f6db25d78ad"} Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.698115 4774 scope.go:117] "RemoveContainer" containerID="19b2a88f3ad1e8d5bb7b074b82327f25e7eee7aa1dfd2f202b6eb0f21e8b545e" Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.698314 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-lvqzl" Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.716417 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c1a3949e-e232-492e-98fe-47a948f55f73","Type":"ContainerStarted","Data":"07a03d55d2054a7881172ed076d932cd3cbd0f012fbf770f48570ea5cef4adfc"} Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.794492 4774 scope.go:117] "RemoveContainer" containerID="af096e38e623058c9881f6275cb8f7c8a76aba4c10e8869a317a763a6e7d1dd9" Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.810100 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-lvqzl"] Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.827010 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-lvqzl"] Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.860195 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.872228 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 15:03:00 crc kubenswrapper[4774]: I1003 15:03:00.885239 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 15:03:01 crc kubenswrapper[4774]: I1003 15:03:01.311241 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51d6dfe6-9084-4868-93c7-58fe660a6d98" path="/var/lib/kubelet/pods/51d6dfe6-9084-4868-93c7-58fe660a6d98/volumes" Oct 03 15:03:01 crc kubenswrapper[4774]: I1003 15:03:01.312493 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="900d9b8b-8106-44c5-a25a-db56b0639d7f" path="/var/lib/kubelet/pods/900d9b8b-8106-44c5-a25a-db56b0639d7f/volumes" Oct 03 15:03:01 crc kubenswrapper[4774]: I1003 15:03:01.725884 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"c1a3949e-e232-492e-98fe-47a948f55f73","Type":"ContainerStarted","Data":"572248f1d44fd639c647128a8abe1059f52cc44382591ea6036c088066686884"} Oct 03 15:03:01 crc kubenswrapper[4774]: I1003 15:03:01.725931 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c1a3949e-e232-492e-98fe-47a948f55f73","Type":"ContainerStarted","Data":"bed249a2866a161e6b73cba119b1ce00685acd7b796346667016900a5afcf0e7"} Oct 03 15:03:01 crc kubenswrapper[4774]: I1003 15:03:01.726535 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c1a3949e-e232-492e-98fe-47a948f55f73" containerName="nova-metadata-metadata" containerID="cri-o://572248f1d44fd639c647128a8abe1059f52cc44382591ea6036c088066686884" gracePeriod=30 Oct 03 15:03:01 crc kubenswrapper[4774]: I1003 15:03:01.726655 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="76464c8b-55de-4a35-91ab-8cee9db23bf7" containerName="nova-api-log" containerID="cri-o://4a0e88dcb2bfde4646459d68166019721ef4badd66a6f7b4dca088147ef07308" gracePeriod=30 Oct 03 15:03:01 crc kubenswrapper[4774]: I1003 15:03:01.726750 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="76464c8b-55de-4a35-91ab-8cee9db23bf7" containerName="nova-api-api" containerID="cri-o://cb39e10eda03d794bacdb597f3b76a9eb8922ffe455b02d2805099bb5d135c90" gracePeriod=30 Oct 03 15:03:01 crc kubenswrapper[4774]: I1003 15:03:01.728124 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c1a3949e-e232-492e-98fe-47a948f55f73" containerName="nova-metadata-log" containerID="cri-o://bed249a2866a161e6b73cba119b1ce00685acd7b796346667016900a5afcf0e7" gracePeriod=30 Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.320818 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.390338 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1a3949e-e232-492e-98fe-47a948f55f73-config-data\") pod \"c1a3949e-e232-492e-98fe-47a948f55f73\" (UID: \"c1a3949e-e232-492e-98fe-47a948f55f73\") " Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.390420 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1a3949e-e232-492e-98fe-47a948f55f73-logs\") pod \"c1a3949e-e232-492e-98fe-47a948f55f73\" (UID: \"c1a3949e-e232-492e-98fe-47a948f55f73\") " Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.390595 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1a3949e-e232-492e-98fe-47a948f55f73-combined-ca-bundle\") pod \"c1a3949e-e232-492e-98fe-47a948f55f73\" (UID: \"c1a3949e-e232-492e-98fe-47a948f55f73\") " Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.390673 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1a3949e-e232-492e-98fe-47a948f55f73-nova-metadata-tls-certs\") pod \"c1a3949e-e232-492e-98fe-47a948f55f73\" (UID: \"c1a3949e-e232-492e-98fe-47a948f55f73\") " Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.390694 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjfch\" (UniqueName: \"kubernetes.io/projected/c1a3949e-e232-492e-98fe-47a948f55f73-kube-api-access-mjfch\") pod \"c1a3949e-e232-492e-98fe-47a948f55f73\" (UID: \"c1a3949e-e232-492e-98fe-47a948f55f73\") " Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.391887 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/c1a3949e-e232-492e-98fe-47a948f55f73-logs" (OuterVolumeSpecName: "logs") pod "c1a3949e-e232-492e-98fe-47a948f55f73" (UID: "c1a3949e-e232-492e-98fe-47a948f55f73"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.397572 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1a3949e-e232-492e-98fe-47a948f55f73-kube-api-access-mjfch" (OuterVolumeSpecName: "kube-api-access-mjfch") pod "c1a3949e-e232-492e-98fe-47a948f55f73" (UID: "c1a3949e-e232-492e-98fe-47a948f55f73"). InnerVolumeSpecName "kube-api-access-mjfch". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.418341 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1a3949e-e232-492e-98fe-47a948f55f73-config-data" (OuterVolumeSpecName: "config-data") pod "c1a3949e-e232-492e-98fe-47a948f55f73" (UID: "c1a3949e-e232-492e-98fe-47a948f55f73"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.424846 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1a3949e-e232-492e-98fe-47a948f55f73-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c1a3949e-e232-492e-98fe-47a948f55f73" (UID: "c1a3949e-e232-492e-98fe-47a948f55f73"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.469659 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1a3949e-e232-492e-98fe-47a948f55f73-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "c1a3949e-e232-492e-98fe-47a948f55f73" (UID: "c1a3949e-e232-492e-98fe-47a948f55f73"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.493187 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1a3949e-e232-492e-98fe-47a948f55f73-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.493232 4774 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1a3949e-e232-492e-98fe-47a948f55f73-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.493247 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjfch\" (UniqueName: \"kubernetes.io/projected/c1a3949e-e232-492e-98fe-47a948f55f73-kube-api-access-mjfch\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.493260 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1a3949e-e232-492e-98fe-47a948f55f73-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.493273 4774 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1a3949e-e232-492e-98fe-47a948f55f73-logs\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.736541 4774 generic.go:334] "Generic (PLEG): container finished" podID="76464c8b-55de-4a35-91ab-8cee9db23bf7" containerID="4a0e88dcb2bfde4646459d68166019721ef4badd66a6f7b4dca088147ef07308" exitCode=143 Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.736643 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"76464c8b-55de-4a35-91ab-8cee9db23bf7","Type":"ContainerDied","Data":"4a0e88dcb2bfde4646459d68166019721ef4badd66a6f7b4dca088147ef07308"} 
Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.739547 4774 generic.go:334] "Generic (PLEG): container finished" podID="c1a3949e-e232-492e-98fe-47a948f55f73" containerID="572248f1d44fd639c647128a8abe1059f52cc44382591ea6036c088066686884" exitCode=0 Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.739573 4774 generic.go:334] "Generic (PLEG): container finished" podID="c1a3949e-e232-492e-98fe-47a948f55f73" containerID="bed249a2866a161e6b73cba119b1ce00685acd7b796346667016900a5afcf0e7" exitCode=143 Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.739593 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c1a3949e-e232-492e-98fe-47a948f55f73","Type":"ContainerDied","Data":"572248f1d44fd639c647128a8abe1059f52cc44382591ea6036c088066686884"} Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.739638 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.739656 4774 scope.go:117] "RemoveContainer" containerID="572248f1d44fd639c647128a8abe1059f52cc44382591ea6036c088066686884" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.739642 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c1a3949e-e232-492e-98fe-47a948f55f73","Type":"ContainerDied","Data":"bed249a2866a161e6b73cba119b1ce00685acd7b796346667016900a5afcf0e7"} Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.739889 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c1a3949e-e232-492e-98fe-47a948f55f73","Type":"ContainerDied","Data":"07a03d55d2054a7881172ed076d932cd3cbd0f012fbf770f48570ea5cef4adfc"} Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.741705 4774 generic.go:334] "Generic (PLEG): container finished" podID="6c7e28d1-6897-4f5f-ad56-6036055365ad" 
containerID="ab49b1b43e1afd447c085cc1fb1a827df7e4610d24b0645fb5af172b3d241d8b" exitCode=0 Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.741813 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zsswn" event={"ID":"6c7e28d1-6897-4f5f-ad56-6036055365ad","Type":"ContainerDied","Data":"ab49b1b43e1afd447c085cc1fb1a827df7e4610d24b0645fb5af172b3d241d8b"} Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.741846 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b379783f-0164-447b-a92f-408ff7901cea" containerName="nova-scheduler-scheduler" containerID="cri-o://0c608fd982706c9ee8fdc248ceb58bc434ede65e653c5cfc23898528594081fc" gracePeriod=30 Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.763323 4774 scope.go:117] "RemoveContainer" containerID="bed249a2866a161e6b73cba119b1ce00685acd7b796346667016900a5afcf0e7" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.788883 4774 scope.go:117] "RemoveContainer" containerID="572248f1d44fd639c647128a8abe1059f52cc44382591ea6036c088066686884" Oct 03 15:03:02 crc kubenswrapper[4774]: E1003 15:03:02.789422 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"572248f1d44fd639c647128a8abe1059f52cc44382591ea6036c088066686884\": container with ID starting with 572248f1d44fd639c647128a8abe1059f52cc44382591ea6036c088066686884 not found: ID does not exist" containerID="572248f1d44fd639c647128a8abe1059f52cc44382591ea6036c088066686884" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.789465 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"572248f1d44fd639c647128a8abe1059f52cc44382591ea6036c088066686884"} err="failed to get container status \"572248f1d44fd639c647128a8abe1059f52cc44382591ea6036c088066686884\": rpc error: code = NotFound desc = could not find container 
\"572248f1d44fd639c647128a8abe1059f52cc44382591ea6036c088066686884\": container with ID starting with 572248f1d44fd639c647128a8abe1059f52cc44382591ea6036c088066686884 not found: ID does not exist" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.789491 4774 scope.go:117] "RemoveContainer" containerID="bed249a2866a161e6b73cba119b1ce00685acd7b796346667016900a5afcf0e7" Oct 03 15:03:02 crc kubenswrapper[4774]: E1003 15:03:02.789885 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bed249a2866a161e6b73cba119b1ce00685acd7b796346667016900a5afcf0e7\": container with ID starting with bed249a2866a161e6b73cba119b1ce00685acd7b796346667016900a5afcf0e7 not found: ID does not exist" containerID="bed249a2866a161e6b73cba119b1ce00685acd7b796346667016900a5afcf0e7" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.789963 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bed249a2866a161e6b73cba119b1ce00685acd7b796346667016900a5afcf0e7"} err="failed to get container status \"bed249a2866a161e6b73cba119b1ce00685acd7b796346667016900a5afcf0e7\": rpc error: code = NotFound desc = could not find container \"bed249a2866a161e6b73cba119b1ce00685acd7b796346667016900a5afcf0e7\": container with ID starting with bed249a2866a161e6b73cba119b1ce00685acd7b796346667016900a5afcf0e7 not found: ID does not exist" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.790033 4774 scope.go:117] "RemoveContainer" containerID="572248f1d44fd639c647128a8abe1059f52cc44382591ea6036c088066686884" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.790331 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"572248f1d44fd639c647128a8abe1059f52cc44382591ea6036c088066686884"} err="failed to get container status \"572248f1d44fd639c647128a8abe1059f52cc44382591ea6036c088066686884\": rpc error: code = NotFound desc = could not find 
container \"572248f1d44fd639c647128a8abe1059f52cc44382591ea6036c088066686884\": container with ID starting with 572248f1d44fd639c647128a8abe1059f52cc44382591ea6036c088066686884 not found: ID does not exist" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.790432 4774 scope.go:117] "RemoveContainer" containerID="bed249a2866a161e6b73cba119b1ce00685acd7b796346667016900a5afcf0e7" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.791703 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bed249a2866a161e6b73cba119b1ce00685acd7b796346667016900a5afcf0e7"} err="failed to get container status \"bed249a2866a161e6b73cba119b1ce00685acd7b796346667016900a5afcf0e7\": rpc error: code = NotFound desc = could not find container \"bed249a2866a161e6b73cba119b1ce00685acd7b796346667016900a5afcf0e7\": container with ID starting with bed249a2866a161e6b73cba119b1ce00685acd7b796346667016900a5afcf0e7 not found: ID does not exist" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.798215 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.808359 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.816729 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 15:03:02 crc kubenswrapper[4774]: E1003 15:03:02.817112 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1a3949e-e232-492e-98fe-47a948f55f73" containerName="nova-metadata-log" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.817133 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1a3949e-e232-492e-98fe-47a948f55f73" containerName="nova-metadata-log" Oct 03 15:03:02 crc kubenswrapper[4774]: E1003 15:03:02.817150 4774 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="900d9b8b-8106-44c5-a25a-db56b0639d7f" containerName="init" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.817158 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="900d9b8b-8106-44c5-a25a-db56b0639d7f" containerName="init" Oct 03 15:03:02 crc kubenswrapper[4774]: E1003 15:03:02.817171 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="900d9b8b-8106-44c5-a25a-db56b0639d7f" containerName="dnsmasq-dns" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.817179 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="900d9b8b-8106-44c5-a25a-db56b0639d7f" containerName="dnsmasq-dns" Oct 03 15:03:02 crc kubenswrapper[4774]: E1003 15:03:02.817208 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1a3949e-e232-492e-98fe-47a948f55f73" containerName="nova-metadata-metadata" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.817216 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1a3949e-e232-492e-98fe-47a948f55f73" containerName="nova-metadata-metadata" Oct 03 15:03:02 crc kubenswrapper[4774]: E1003 15:03:02.817224 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77049c85-7aed-49f4-8dff-4a9a7a3a6b06" containerName="nova-manage" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.817230 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="77049c85-7aed-49f4-8dff-4a9a7a3a6b06" containerName="nova-manage" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.821789 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1a3949e-e232-492e-98fe-47a948f55f73" containerName="nova-metadata-log" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.821864 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1a3949e-e232-492e-98fe-47a948f55f73" containerName="nova-metadata-metadata" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.821899 4774 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="900d9b8b-8106-44c5-a25a-db56b0639d7f" containerName="dnsmasq-dns" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.821916 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="77049c85-7aed-49f4-8dff-4a9a7a3a6b06" containerName="nova-manage" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.823272 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.835882 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.836062 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.837113 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.899675 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a062c05-477a-48c4-be23-ec0690970699-config-data\") pod \"nova-metadata-0\" (UID: \"2a062c05-477a-48c4-be23-ec0690970699\") " pod="openstack/nova-metadata-0" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.899729 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a062c05-477a-48c4-be23-ec0690970699-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2a062c05-477a-48c4-be23-ec0690970699\") " pod="openstack/nova-metadata-0" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.899769 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a062c05-477a-48c4-be23-ec0690970699-logs\") pod \"nova-metadata-0\" (UID: 
\"2a062c05-477a-48c4-be23-ec0690970699\") " pod="openstack/nova-metadata-0" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.899899 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a062c05-477a-48c4-be23-ec0690970699-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2a062c05-477a-48c4-be23-ec0690970699\") " pod="openstack/nova-metadata-0" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.899953 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcsdm\" (UniqueName: \"kubernetes.io/projected/2a062c05-477a-48c4-be23-ec0690970699-kube-api-access-xcsdm\") pod \"nova-metadata-0\" (UID: \"2a062c05-477a-48c4-be23-ec0690970699\") " pod="openstack/nova-metadata-0" Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.968772 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 15:03:02 crc kubenswrapper[4774]: I1003 15:03:02.969065 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="dc5407ed-2e7c-4f5e-b24a-f040659e71f1" containerName="kube-state-metrics" containerID="cri-o://80936cb28db012f2bafbf7c939b2e0b63ff5591ff3ad497d90dc6ef19897378c" gracePeriod=30 Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.001064 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a062c05-477a-48c4-be23-ec0690970699-config-data\") pod \"nova-metadata-0\" (UID: \"2a062c05-477a-48c4-be23-ec0690970699\") " pod="openstack/nova-metadata-0" Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.001150 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a062c05-477a-48c4-be23-ec0690970699-combined-ca-bundle\") 
pod \"nova-metadata-0\" (UID: \"2a062c05-477a-48c4-be23-ec0690970699\") " pod="openstack/nova-metadata-0" Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.001189 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a062c05-477a-48c4-be23-ec0690970699-logs\") pod \"nova-metadata-0\" (UID: \"2a062c05-477a-48c4-be23-ec0690970699\") " pod="openstack/nova-metadata-0" Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.001302 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a062c05-477a-48c4-be23-ec0690970699-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2a062c05-477a-48c4-be23-ec0690970699\") " pod="openstack/nova-metadata-0" Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.001344 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcsdm\" (UniqueName: \"kubernetes.io/projected/2a062c05-477a-48c4-be23-ec0690970699-kube-api-access-xcsdm\") pod \"nova-metadata-0\" (UID: \"2a062c05-477a-48c4-be23-ec0690970699\") " pod="openstack/nova-metadata-0" Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.002174 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a062c05-477a-48c4-be23-ec0690970699-logs\") pod \"nova-metadata-0\" (UID: \"2a062c05-477a-48c4-be23-ec0690970699\") " pod="openstack/nova-metadata-0" Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.012252 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a062c05-477a-48c4-be23-ec0690970699-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2a062c05-477a-48c4-be23-ec0690970699\") " pod="openstack/nova-metadata-0" Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.012643 4774 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a062c05-477a-48c4-be23-ec0690970699-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2a062c05-477a-48c4-be23-ec0690970699\") " pod="openstack/nova-metadata-0" Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.012848 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a062c05-477a-48c4-be23-ec0690970699-config-data\") pod \"nova-metadata-0\" (UID: \"2a062c05-477a-48c4-be23-ec0690970699\") " pod="openstack/nova-metadata-0" Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.025945 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcsdm\" (UniqueName: \"kubernetes.io/projected/2a062c05-477a-48c4-be23-ec0690970699-kube-api-access-xcsdm\") pod \"nova-metadata-0\" (UID: \"2a062c05-477a-48c4-be23-ec0690970699\") " pod="openstack/nova-metadata-0" Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.152927 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.317775 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1a3949e-e232-492e-98fe-47a948f55f73" path="/var/lib/kubelet/pods/c1a3949e-e232-492e-98fe-47a948f55f73/volumes" Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.497354 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.610262 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjjfp\" (UniqueName: \"kubernetes.io/projected/dc5407ed-2e7c-4f5e-b24a-f040659e71f1-kube-api-access-sjjfp\") pod \"dc5407ed-2e7c-4f5e-b24a-f040659e71f1\" (UID: \"dc5407ed-2e7c-4f5e-b24a-f040659e71f1\") " Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.623565 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc5407ed-2e7c-4f5e-b24a-f040659e71f1-kube-api-access-sjjfp" (OuterVolumeSpecName: "kube-api-access-sjjfp") pod "dc5407ed-2e7c-4f5e-b24a-f040659e71f1" (UID: "dc5407ed-2e7c-4f5e-b24a-f040659e71f1"). InnerVolumeSpecName "kube-api-access-sjjfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.688431 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.713619 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjjfp\" (UniqueName: \"kubernetes.io/projected/dc5407ed-2e7c-4f5e-b24a-f040659e71f1-kube-api-access-sjjfp\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:03 crc kubenswrapper[4774]: E1003 15:03:03.755445 4774 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0c608fd982706c9ee8fdc248ceb58bc434ede65e653c5cfc23898528594081fc is running failed: container process not found" containerID="0c608fd982706c9ee8fdc248ceb58bc434ede65e653c5cfc23898528594081fc" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 15:03:03 crc kubenswrapper[4774]: E1003 15:03:03.755819 4774 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if 
PID of 0c608fd982706c9ee8fdc248ceb58bc434ede65e653c5cfc23898528594081fc is running failed: container process not found" containerID="0c608fd982706c9ee8fdc248ceb58bc434ede65e653c5cfc23898528594081fc" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.762606 4774 generic.go:334] "Generic (PLEG): container finished" podID="dc5407ed-2e7c-4f5e-b24a-f040659e71f1" containerID="80936cb28db012f2bafbf7c939b2e0b63ff5591ff3ad497d90dc6ef19897378c" exitCode=2 Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.762684 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.762789 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"dc5407ed-2e7c-4f5e-b24a-f040659e71f1","Type":"ContainerDied","Data":"80936cb28db012f2bafbf7c939b2e0b63ff5591ff3ad497d90dc6ef19897378c"} Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.762826 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"dc5407ed-2e7c-4f5e-b24a-f040659e71f1","Type":"ContainerDied","Data":"c18661d3fffda6642bd4b7a66fd89d9daf7f14954054f1f1714bc4aafd7d7921"} Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.762846 4774 scope.go:117] "RemoveContainer" containerID="80936cb28db012f2bafbf7c939b2e0b63ff5591ff3ad497d90dc6ef19897378c" Oct 03 15:03:03 crc kubenswrapper[4774]: E1003 15:03:03.765420 4774 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0c608fd982706c9ee8fdc248ceb58bc434ede65e653c5cfc23898528594081fc is running failed: container process not found" containerID="0c608fd982706c9ee8fdc248ceb58bc434ede65e653c5cfc23898528594081fc" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 15:03:03 crc kubenswrapper[4774]: E1003 15:03:03.765482 
4774 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0c608fd982706c9ee8fdc248ceb58bc434ede65e653c5cfc23898528594081fc is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="b379783f-0164-447b-a92f-408ff7901cea" containerName="nova-scheduler-scheduler" Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.835817 4774 scope.go:117] "RemoveContainer" containerID="80936cb28db012f2bafbf7c939b2e0b63ff5591ff3ad497d90dc6ef19897378c" Oct 03 15:03:03 crc kubenswrapper[4774]: E1003 15:03:03.836167 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80936cb28db012f2bafbf7c939b2e0b63ff5591ff3ad497d90dc6ef19897378c\": container with ID starting with 80936cb28db012f2bafbf7c939b2e0b63ff5591ff3ad497d90dc6ef19897378c not found: ID does not exist" containerID="80936cb28db012f2bafbf7c939b2e0b63ff5591ff3ad497d90dc6ef19897378c" Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.836194 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80936cb28db012f2bafbf7c939b2e0b63ff5591ff3ad497d90dc6ef19897378c"} err="failed to get container status \"80936cb28db012f2bafbf7c939b2e0b63ff5591ff3ad497d90dc6ef19897378c\": rpc error: code = NotFound desc = could not find container \"80936cb28db012f2bafbf7c939b2e0b63ff5591ff3ad497d90dc6ef19897378c\": container with ID starting with 80936cb28db012f2bafbf7c939b2e0b63ff5591ff3ad497d90dc6ef19897378c not found: ID does not exist" Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.849962 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.873604 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.883209 
4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 15:03:03 crc kubenswrapper[4774]: E1003 15:03:03.883855 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc5407ed-2e7c-4f5e-b24a-f040659e71f1" containerName="kube-state-metrics" Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.883886 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc5407ed-2e7c-4f5e-b24a-f040659e71f1" containerName="kube-state-metrics" Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.884118 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc5407ed-2e7c-4f5e-b24a-f040659e71f1" containerName="kube-state-metrics" Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.884984 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.890297 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.890598 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.902775 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.916519 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a584407c-d7f9-436b-a293-fe97f4ed3c78-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"a584407c-d7f9-436b-a293-fe97f4ed3c78\") " pod="openstack/kube-state-metrics-0" Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.916615 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a584407c-d7f9-436b-a293-fe97f4ed3c78-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"a584407c-d7f9-436b-a293-fe97f4ed3c78\") " pod="openstack/kube-state-metrics-0" Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.916636 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsc9q\" (UniqueName: \"kubernetes.io/projected/a584407c-d7f9-436b-a293-fe97f4ed3c78-kube-api-access-bsc9q\") pod \"kube-state-metrics-0\" (UID: \"a584407c-d7f9-436b-a293-fe97f4ed3c78\") " pod="openstack/kube-state-metrics-0" Oct 03 15:03:03 crc kubenswrapper[4774]: I1003 15:03:03.916657 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a584407c-d7f9-436b-a293-fe97f4ed3c78-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"a584407c-d7f9-436b-a293-fe97f4ed3c78\") " pod="openstack/kube-state-metrics-0" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.018027 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a584407c-d7f9-436b-a293-fe97f4ed3c78-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"a584407c-d7f9-436b-a293-fe97f4ed3c78\") " pod="openstack/kube-state-metrics-0" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.018587 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsc9q\" (UniqueName: \"kubernetes.io/projected/a584407c-d7f9-436b-a293-fe97f4ed3c78-kube-api-access-bsc9q\") pod \"kube-state-metrics-0\" (UID: \"a584407c-d7f9-436b-a293-fe97f4ed3c78\") " pod="openstack/kube-state-metrics-0" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.018614 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a584407c-d7f9-436b-a293-fe97f4ed3c78-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"a584407c-d7f9-436b-a293-fe97f4ed3c78\") " pod="openstack/kube-state-metrics-0" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.018639 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a584407c-d7f9-436b-a293-fe97f4ed3c78-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"a584407c-d7f9-436b-a293-fe97f4ed3c78\") " pod="openstack/kube-state-metrics-0" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.036895 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a584407c-d7f9-436b-a293-fe97f4ed3c78-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"a584407c-d7f9-436b-a293-fe97f4ed3c78\") " pod="openstack/kube-state-metrics-0" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.037466 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a584407c-d7f9-436b-a293-fe97f4ed3c78-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"a584407c-d7f9-436b-a293-fe97f4ed3c78\") " pod="openstack/kube-state-metrics-0" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.039079 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a584407c-d7f9-436b-a293-fe97f4ed3c78-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"a584407c-d7f9-436b-a293-fe97f4ed3c78\") " pod="openstack/kube-state-metrics-0" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.048413 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsc9q\" (UniqueName: 
\"kubernetes.io/projected/a584407c-d7f9-436b-a293-fe97f4ed3c78-kube-api-access-bsc9q\") pod \"kube-state-metrics-0\" (UID: \"a584407c-d7f9-436b-a293-fe97f4ed3c78\") " pod="openstack/kube-state-metrics-0" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.101108 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.173266 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zsswn" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.214954 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.221349 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c7e28d1-6897-4f5f-ad56-6036055365ad-scripts\") pod \"6c7e28d1-6897-4f5f-ad56-6036055365ad\" (UID: \"6c7e28d1-6897-4f5f-ad56-6036055365ad\") " Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.221472 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b379783f-0164-447b-a92f-408ff7901cea-config-data\") pod \"b379783f-0164-447b-a92f-408ff7901cea\" (UID: \"b379783f-0164-447b-a92f-408ff7901cea\") " Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.221534 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b379783f-0164-447b-a92f-408ff7901cea-combined-ca-bundle\") pod \"b379783f-0164-447b-a92f-408ff7901cea\" (UID: \"b379783f-0164-447b-a92f-408ff7901cea\") " Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.221561 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9qlb\" (UniqueName: 
\"kubernetes.io/projected/6c7e28d1-6897-4f5f-ad56-6036055365ad-kube-api-access-b9qlb\") pod \"6c7e28d1-6897-4f5f-ad56-6036055365ad\" (UID: \"6c7e28d1-6897-4f5f-ad56-6036055365ad\") " Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.221608 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xw68d\" (UniqueName: \"kubernetes.io/projected/b379783f-0164-447b-a92f-408ff7901cea-kube-api-access-xw68d\") pod \"b379783f-0164-447b-a92f-408ff7901cea\" (UID: \"b379783f-0164-447b-a92f-408ff7901cea\") " Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.221642 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c7e28d1-6897-4f5f-ad56-6036055365ad-config-data\") pod \"6c7e28d1-6897-4f5f-ad56-6036055365ad\" (UID: \"6c7e28d1-6897-4f5f-ad56-6036055365ad\") " Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.221738 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c7e28d1-6897-4f5f-ad56-6036055365ad-combined-ca-bundle\") pod \"6c7e28d1-6897-4f5f-ad56-6036055365ad\" (UID: \"6c7e28d1-6897-4f5f-ad56-6036055365ad\") " Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.231660 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b379783f-0164-447b-a92f-408ff7901cea-kube-api-access-xw68d" (OuterVolumeSpecName: "kube-api-access-xw68d") pod "b379783f-0164-447b-a92f-408ff7901cea" (UID: "b379783f-0164-447b-a92f-408ff7901cea"). InnerVolumeSpecName "kube-api-access-xw68d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.233872 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c7e28d1-6897-4f5f-ad56-6036055365ad-kube-api-access-b9qlb" (OuterVolumeSpecName: "kube-api-access-b9qlb") pod "6c7e28d1-6897-4f5f-ad56-6036055365ad" (UID: "6c7e28d1-6897-4f5f-ad56-6036055365ad"). InnerVolumeSpecName "kube-api-access-b9qlb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.234016 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c7e28d1-6897-4f5f-ad56-6036055365ad-scripts" (OuterVolumeSpecName: "scripts") pod "6c7e28d1-6897-4f5f-ad56-6036055365ad" (UID: "6c7e28d1-6897-4f5f-ad56-6036055365ad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.276637 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c7e28d1-6897-4f5f-ad56-6036055365ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c7e28d1-6897-4f5f-ad56-6036055365ad" (UID: "6c7e28d1-6897-4f5f-ad56-6036055365ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.283840 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b379783f-0164-447b-a92f-408ff7901cea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b379783f-0164-447b-a92f-408ff7901cea" (UID: "b379783f-0164-447b-a92f-408ff7901cea"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.285297 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c7e28d1-6897-4f5f-ad56-6036055365ad-config-data" (OuterVolumeSpecName: "config-data") pod "6c7e28d1-6897-4f5f-ad56-6036055365ad" (UID: "6c7e28d1-6897-4f5f-ad56-6036055365ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.293286 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b379783f-0164-447b-a92f-408ff7901cea-config-data" (OuterVolumeSpecName: "config-data") pod "b379783f-0164-447b-a92f-408ff7901cea" (UID: "b379783f-0164-447b-a92f-408ff7901cea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.324758 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c7e28d1-6897-4f5f-ad56-6036055365ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.324801 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c7e28d1-6897-4f5f-ad56-6036055365ad-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.324815 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b379783f-0164-447b-a92f-408ff7901cea-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.324826 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b379783f-0164-447b-a92f-408ff7901cea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:04 crc 
kubenswrapper[4774]: I1003 15:03:04.324838 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9qlb\" (UniqueName: \"kubernetes.io/projected/6c7e28d1-6897-4f5f-ad56-6036055365ad-kube-api-access-b9qlb\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.324855 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xw68d\" (UniqueName: \"kubernetes.io/projected/b379783f-0164-447b-a92f-408ff7901cea-kube-api-access-xw68d\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.324867 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c7e28d1-6897-4f5f-ad56-6036055365ad-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.675542 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.782821 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zsswn" event={"ID":"6c7e28d1-6897-4f5f-ad56-6036055365ad","Type":"ContainerDied","Data":"c5d2eaa1980654cf584f83a8ce64672d208c93c9d0174397d78954737c9a4eaa"} Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.782894 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5d2eaa1980654cf584f83a8ce64672d208c93c9d0174397d78954737c9a4eaa" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.782991 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zsswn" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.790307 4774 generic.go:334] "Generic (PLEG): container finished" podID="b379783f-0164-447b-a92f-408ff7901cea" containerID="0c608fd982706c9ee8fdc248ceb58bc434ede65e653c5cfc23898528594081fc" exitCode=0 Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.790461 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.790483 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b379783f-0164-447b-a92f-408ff7901cea","Type":"ContainerDied","Data":"0c608fd982706c9ee8fdc248ceb58bc434ede65e653c5cfc23898528594081fc"} Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.790522 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b379783f-0164-447b-a92f-408ff7901cea","Type":"ContainerDied","Data":"954dbbc2324a332622ab560b9eb88c0a1013b51db8d07b486a8f3515ca618335"} Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.790543 4774 scope.go:117] "RemoveContainer" containerID="0c608fd982706c9ee8fdc248ceb58bc434ede65e653c5cfc23898528594081fc" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.793832 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2a062c05-477a-48c4-be23-ec0690970699","Type":"ContainerStarted","Data":"47ae8ed37e143ddafa7ded2644ff870b4905267ab7b74f7dbdf29f3f7faf9026"} Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.793864 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2a062c05-477a-48c4-be23-ec0690970699","Type":"ContainerStarted","Data":"b600f1ca5179462b2efcc9c62bea9d524a7cf73dbf7053485dbde0c034625316"} Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.793874 4774 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-metadata-0" event={"ID":"2a062c05-477a-48c4-be23-ec0690970699","Type":"ContainerStarted","Data":"f307919fda045c366e28355b8dcf1587519c4775c86caaf112a4aa116ff59122"} Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.826192 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a584407c-d7f9-436b-a293-fe97f4ed3c78","Type":"ContainerStarted","Data":"e8b89a582b6a2bdb3476ece9b5b4fa4405a5182ef6b04e6160f5050b287fd32f"} Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.832295 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.832272685 podStartE2EDuration="2.832272685s" podCreationTimestamp="2025-10-03 15:03:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:03:04.821818836 +0000 UTC m=+1207.411022318" watchObservedRunningTime="2025-10-03 15:03:04.832272685 +0000 UTC m=+1207.421476147" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.856907 4774 scope.go:117] "RemoveContainer" containerID="0c608fd982706c9ee8fdc248ceb58bc434ede65e653c5cfc23898528594081fc" Oct 03 15:03:04 crc kubenswrapper[4774]: E1003 15:03:04.857673 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c608fd982706c9ee8fdc248ceb58bc434ede65e653c5cfc23898528594081fc\": container with ID starting with 0c608fd982706c9ee8fdc248ceb58bc434ede65e653c5cfc23898528594081fc not found: ID does not exist" containerID="0c608fd982706c9ee8fdc248ceb58bc434ede65e653c5cfc23898528594081fc" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.857710 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c608fd982706c9ee8fdc248ceb58bc434ede65e653c5cfc23898528594081fc"} err="failed to get container status 
\"0c608fd982706c9ee8fdc248ceb58bc434ede65e653c5cfc23898528594081fc\": rpc error: code = NotFound desc = could not find container \"0c608fd982706c9ee8fdc248ceb58bc434ede65e653c5cfc23898528594081fc\": container with ID starting with 0c608fd982706c9ee8fdc248ceb58bc434ede65e653c5cfc23898528594081fc not found: ID does not exist" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.859514 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 15:03:04 crc kubenswrapper[4774]: E1003 15:03:04.860207 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b379783f-0164-447b-a92f-408ff7901cea" containerName="nova-scheduler-scheduler" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.860238 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="b379783f-0164-447b-a92f-408ff7901cea" containerName="nova-scheduler-scheduler" Oct 03 15:03:04 crc kubenswrapper[4774]: E1003 15:03:04.860281 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c7e28d1-6897-4f5f-ad56-6036055365ad" containerName="nova-cell1-conductor-db-sync" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.860294 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c7e28d1-6897-4f5f-ad56-6036055365ad" containerName="nova-cell1-conductor-db-sync" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.860724 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c7e28d1-6897-4f5f-ad56-6036055365ad" containerName="nova-cell1-conductor-db-sync" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.860784 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="b379783f-0164-447b-a92f-408ff7901cea" containerName="nova-scheduler-scheduler" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.861601 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.864787 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.871025 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.883916 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.892996 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.911167 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.913792 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.916449 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.918655 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.940287 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18303598-0afd-48d0-a93a-a523807d8e37-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"18303598-0afd-48d0-a93a-a523807d8e37\") " pod="openstack/nova-cell1-conductor-0" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.940360 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/18303598-0afd-48d0-a93a-a523807d8e37-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"18303598-0afd-48d0-a93a-a523807d8e37\") " pod="openstack/nova-cell1-conductor-0" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.940748 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgtl8\" (UniqueName: \"kubernetes.io/projected/18303598-0afd-48d0-a93a-a523807d8e37-kube-api-access-jgtl8\") pod \"nova-cell1-conductor-0\" (UID: \"18303598-0afd-48d0-a93a-a523807d8e37\") " pod="openstack/nova-cell1-conductor-0" Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.987687 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.988001 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="61b07b0f-462d-4e27-b251-85a8d869433a" containerName="ceilometer-central-agent" containerID="cri-o://b89bf2ce3a3e5e5815366e78eddaa47df6eb0e87baf85f18fcf762d1df57e0ef" gracePeriod=30 Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.988061 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="61b07b0f-462d-4e27-b251-85a8d869433a" containerName="sg-core" containerID="cri-o://596fe207acb42e3216e639196ebe78bef80eec721f99081ede5173ad751acc80" gracePeriod=30 Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.988083 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="61b07b0f-462d-4e27-b251-85a8d869433a" containerName="ceilometer-notification-agent" containerID="cri-o://a86fa64b9193a9e410ebadfa3c2adbe8f55679bfc01b57e812625095056114f9" gracePeriod=30 Oct 03 15:03:04 crc kubenswrapper[4774]: I1003 15:03:04.988104 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="61b07b0f-462d-4e27-b251-85a8d869433a" containerName="proxy-httpd" containerID="cri-o://e77bd534b5300e2440ac57b7cae4166aab2355bf4eb237feec3fc435c863d7a4" gracePeriod=30 Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.042337 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh5p4\" (UniqueName: \"kubernetes.io/projected/14f3b6b4-467a-4848-aed7-abeffc7767ad-kube-api-access-wh5p4\") pod \"nova-scheduler-0\" (UID: \"14f3b6b4-467a-4848-aed7-abeffc7767ad\") " pod="openstack/nova-scheduler-0" Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.042435 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18303598-0afd-48d0-a93a-a523807d8e37-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"18303598-0afd-48d0-a93a-a523807d8e37\") " pod="openstack/nova-cell1-conductor-0" Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.042460 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18303598-0afd-48d0-a93a-a523807d8e37-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"18303598-0afd-48d0-a93a-a523807d8e37\") " pod="openstack/nova-cell1-conductor-0" Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.042507 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14f3b6b4-467a-4848-aed7-abeffc7767ad-config-data\") pod \"nova-scheduler-0\" (UID: \"14f3b6b4-467a-4848-aed7-abeffc7767ad\") " pod="openstack/nova-scheduler-0" Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.042567 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgtl8\" (UniqueName: \"kubernetes.io/projected/18303598-0afd-48d0-a93a-a523807d8e37-kube-api-access-jgtl8\") pod 
\"nova-cell1-conductor-0\" (UID: \"18303598-0afd-48d0-a93a-a523807d8e37\") " pod="openstack/nova-cell1-conductor-0" Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.042591 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14f3b6b4-467a-4848-aed7-abeffc7767ad-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"14f3b6b4-467a-4848-aed7-abeffc7767ad\") " pod="openstack/nova-scheduler-0" Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.047725 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18303598-0afd-48d0-a93a-a523807d8e37-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"18303598-0afd-48d0-a93a-a523807d8e37\") " pod="openstack/nova-cell1-conductor-0" Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.047766 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18303598-0afd-48d0-a93a-a523807d8e37-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"18303598-0afd-48d0-a93a-a523807d8e37\") " pod="openstack/nova-cell1-conductor-0" Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.056928 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgtl8\" (UniqueName: \"kubernetes.io/projected/18303598-0afd-48d0-a93a-a523807d8e37-kube-api-access-jgtl8\") pod \"nova-cell1-conductor-0\" (UID: \"18303598-0afd-48d0-a93a-a523807d8e37\") " pod="openstack/nova-cell1-conductor-0" Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.143837 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh5p4\" (UniqueName: \"kubernetes.io/projected/14f3b6b4-467a-4848-aed7-abeffc7767ad-kube-api-access-wh5p4\") pod \"nova-scheduler-0\" (UID: \"14f3b6b4-467a-4848-aed7-abeffc7767ad\") " 
pod="openstack/nova-scheduler-0" Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.144059 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14f3b6b4-467a-4848-aed7-abeffc7767ad-config-data\") pod \"nova-scheduler-0\" (UID: \"14f3b6b4-467a-4848-aed7-abeffc7767ad\") " pod="openstack/nova-scheduler-0" Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.144189 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14f3b6b4-467a-4848-aed7-abeffc7767ad-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"14f3b6b4-467a-4848-aed7-abeffc7767ad\") " pod="openstack/nova-scheduler-0" Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.149121 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14f3b6b4-467a-4848-aed7-abeffc7767ad-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"14f3b6b4-467a-4848-aed7-abeffc7767ad\") " pod="openstack/nova-scheduler-0" Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.153316 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14f3b6b4-467a-4848-aed7-abeffc7767ad-config-data\") pod \"nova-scheduler-0\" (UID: \"14f3b6b4-467a-4848-aed7-abeffc7767ad\") " pod="openstack/nova-scheduler-0" Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.164876 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh5p4\" (UniqueName: \"kubernetes.io/projected/14f3b6b4-467a-4848-aed7-abeffc7767ad-kube-api-access-wh5p4\") pod \"nova-scheduler-0\" (UID: \"14f3b6b4-467a-4848-aed7-abeffc7767ad\") " pod="openstack/nova-scheduler-0" Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.188292 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.231019 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.316642 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b379783f-0164-447b-a92f-408ff7901cea" path="/var/lib/kubelet/pods/b379783f-0164-447b-a92f-408ff7901cea/volumes" Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.317439 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc5407ed-2e7c-4f5e-b24a-f040659e71f1" path="/var/lib/kubelet/pods/dc5407ed-2e7c-4f5e-b24a-f040659e71f1/volumes" Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.704058 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.800691 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.826474 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.836566 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"14f3b6b4-467a-4848-aed7-abeffc7767ad","Type":"ContainerStarted","Data":"a9b9725ee56e0d8287edcf441e2a0a0c0e93d6c5c1064d4a7479c1c5d46b91fb"} Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.840394 4774 generic.go:334] "Generic (PLEG): container finished" podID="61b07b0f-462d-4e27-b251-85a8d869433a" containerID="e77bd534b5300e2440ac57b7cae4166aab2355bf4eb237feec3fc435c863d7a4" exitCode=0 Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.840425 4774 generic.go:334] "Generic (PLEG): container finished" podID="61b07b0f-462d-4e27-b251-85a8d869433a" containerID="596fe207acb42e3216e639196ebe78bef80eec721f99081ede5173ad751acc80" exitCode=2 Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.840435 4774 generic.go:334] "Generic (PLEG): container finished" podID="61b07b0f-462d-4e27-b251-85a8d869433a" containerID="a86fa64b9193a9e410ebadfa3c2adbe8f55679bfc01b57e812625095056114f9" exitCode=0 Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.840444 4774 generic.go:334] "Generic (PLEG): container finished" podID="61b07b0f-462d-4e27-b251-85a8d869433a" containerID="b89bf2ce3a3e5e5815366e78eddaa47df6eb0e87baf85f18fcf762d1df57e0ef" exitCode=0 Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.840502 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61b07b0f-462d-4e27-b251-85a8d869433a","Type":"ContainerDied","Data":"e77bd534b5300e2440ac57b7cae4166aab2355bf4eb237feec3fc435c863d7a4"} Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.840526 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61b07b0f-462d-4e27-b251-85a8d869433a","Type":"ContainerDied","Data":"596fe207acb42e3216e639196ebe78bef80eec721f99081ede5173ad751acc80"} Oct 03 15:03:05 crc 
kubenswrapper[4774]: I1003 15:03:05.840563 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61b07b0f-462d-4e27-b251-85a8d869433a","Type":"ContainerDied","Data":"a86fa64b9193a9e410ebadfa3c2adbe8f55679bfc01b57e812625095056114f9"} Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.840578 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61b07b0f-462d-4e27-b251-85a8d869433a","Type":"ContainerDied","Data":"b89bf2ce3a3e5e5815366e78eddaa47df6eb0e87baf85f18fcf762d1df57e0ef"} Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.840589 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61b07b0f-462d-4e27-b251-85a8d869433a","Type":"ContainerDied","Data":"5a51b2b323af232df00b84fc8bc22140278810604ea5f319086d8be90fdd51ba"} Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.840608 4774 scope.go:117] "RemoveContainer" containerID="e77bd534b5300e2440ac57b7cae4166aab2355bf4eb237feec3fc435c863d7a4" Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.840819 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.843767 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"18303598-0afd-48d0-a93a-a523807d8e37","Type":"ContainerStarted","Data":"ed292c69be0849e0d319b5401478130ba224a491e92e23807fc03f2b658d31ac"} Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.848236 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a584407c-d7f9-436b-a293-fe97f4ed3c78","Type":"ContainerStarted","Data":"1ca2a9e2fe76c9bafddaff0ee8143f832594107680f98d5b36be4ea05109d8da"} Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.848343 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.887423 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.492775496 podStartE2EDuration="2.887401706s" podCreationTimestamp="2025-10-03 15:03:03 +0000 UTC" firstStartedPulling="2025-10-03 15:03:04.685097707 +0000 UTC m=+1207.274301159" lastFinishedPulling="2025-10-03 15:03:05.079723917 +0000 UTC m=+1207.668927369" observedRunningTime="2025-10-03 15:03:05.879996893 +0000 UTC m=+1208.469200345" watchObservedRunningTime="2025-10-03 15:03:05.887401706 +0000 UTC m=+1208.476605158" Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.896072 4774 scope.go:117] "RemoveContainer" containerID="596fe207acb42e3216e639196ebe78bef80eec721f99081ede5173ad751acc80" Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.939578 4774 scope.go:117] "RemoveContainer" containerID="a86fa64b9193a9e410ebadfa3c2adbe8f55679bfc01b57e812625095056114f9" Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.967951 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkhsl\" (UniqueName: 
\"kubernetes.io/projected/61b07b0f-462d-4e27-b251-85a8d869433a-kube-api-access-xkhsl\") pod \"61b07b0f-462d-4e27-b251-85a8d869433a\" (UID: \"61b07b0f-462d-4e27-b251-85a8d869433a\") " Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.968037 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61b07b0f-462d-4e27-b251-85a8d869433a-sg-core-conf-yaml\") pod \"61b07b0f-462d-4e27-b251-85a8d869433a\" (UID: \"61b07b0f-462d-4e27-b251-85a8d869433a\") " Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.968070 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61b07b0f-462d-4e27-b251-85a8d869433a-log-httpd\") pod \"61b07b0f-462d-4e27-b251-85a8d869433a\" (UID: \"61b07b0f-462d-4e27-b251-85a8d869433a\") " Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.968099 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61b07b0f-462d-4e27-b251-85a8d869433a-run-httpd\") pod \"61b07b0f-462d-4e27-b251-85a8d869433a\" (UID: \"61b07b0f-462d-4e27-b251-85a8d869433a\") " Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.968136 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61b07b0f-462d-4e27-b251-85a8d869433a-config-data\") pod \"61b07b0f-462d-4e27-b251-85a8d869433a\" (UID: \"61b07b0f-462d-4e27-b251-85a8d869433a\") " Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.968173 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61b07b0f-462d-4e27-b251-85a8d869433a-combined-ca-bundle\") pod \"61b07b0f-462d-4e27-b251-85a8d869433a\" (UID: \"61b07b0f-462d-4e27-b251-85a8d869433a\") " Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.968264 4774 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61b07b0f-462d-4e27-b251-85a8d869433a-scripts\") pod \"61b07b0f-462d-4e27-b251-85a8d869433a\" (UID: \"61b07b0f-462d-4e27-b251-85a8d869433a\") " Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.968718 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61b07b0f-462d-4e27-b251-85a8d869433a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "61b07b0f-462d-4e27-b251-85a8d869433a" (UID: "61b07b0f-462d-4e27-b251-85a8d869433a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.969358 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61b07b0f-462d-4e27-b251-85a8d869433a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "61b07b0f-462d-4e27-b251-85a8d869433a" (UID: "61b07b0f-462d-4e27-b251-85a8d869433a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:03:05 crc kubenswrapper[4774]: I1003 15:03:05.969915 4774 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61b07b0f-462d-4e27-b251-85a8d869433a-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:05.990194 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61b07b0f-462d-4e27-b251-85a8d869433a-scripts" (OuterVolumeSpecName: "scripts") pod "61b07b0f-462d-4e27-b251-85a8d869433a" (UID: "61b07b0f-462d-4e27-b251-85a8d869433a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:05.993669 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61b07b0f-462d-4e27-b251-85a8d869433a-kube-api-access-xkhsl" (OuterVolumeSpecName: "kube-api-access-xkhsl") pod "61b07b0f-462d-4e27-b251-85a8d869433a" (UID: "61b07b0f-462d-4e27-b251-85a8d869433a"). InnerVolumeSpecName "kube-api-access-xkhsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.011883 4774 scope.go:117] "RemoveContainer" containerID="b89bf2ce3a3e5e5815366e78eddaa47df6eb0e87baf85f18fcf762d1df57e0ef" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.028996 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61b07b0f-462d-4e27-b251-85a8d869433a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "61b07b0f-462d-4e27-b251-85a8d869433a" (UID: "61b07b0f-462d-4e27-b251-85a8d869433a"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.073642 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61b07b0f-462d-4e27-b251-85a8d869433a-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.073671 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkhsl\" (UniqueName: \"kubernetes.io/projected/61b07b0f-462d-4e27-b251-85a8d869433a-kube-api-access-xkhsl\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.073683 4774 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61b07b0f-462d-4e27-b251-85a8d869433a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.073691 4774 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61b07b0f-462d-4e27-b251-85a8d869433a-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.101588 4774 scope.go:117] "RemoveContainer" containerID="e77bd534b5300e2440ac57b7cae4166aab2355bf4eb237feec3fc435c863d7a4" Oct 03 15:03:06 crc kubenswrapper[4774]: E1003 15:03:06.102533 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e77bd534b5300e2440ac57b7cae4166aab2355bf4eb237feec3fc435c863d7a4\": container with ID starting with e77bd534b5300e2440ac57b7cae4166aab2355bf4eb237feec3fc435c863d7a4 not found: ID does not exist" containerID="e77bd534b5300e2440ac57b7cae4166aab2355bf4eb237feec3fc435c863d7a4" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.102588 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e77bd534b5300e2440ac57b7cae4166aab2355bf4eb237feec3fc435c863d7a4"} 
err="failed to get container status \"e77bd534b5300e2440ac57b7cae4166aab2355bf4eb237feec3fc435c863d7a4\": rpc error: code = NotFound desc = could not find container \"e77bd534b5300e2440ac57b7cae4166aab2355bf4eb237feec3fc435c863d7a4\": container with ID starting with e77bd534b5300e2440ac57b7cae4166aab2355bf4eb237feec3fc435c863d7a4 not found: ID does not exist" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.102614 4774 scope.go:117] "RemoveContainer" containerID="596fe207acb42e3216e639196ebe78bef80eec721f99081ede5173ad751acc80" Oct 03 15:03:06 crc kubenswrapper[4774]: E1003 15:03:06.110920 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"596fe207acb42e3216e639196ebe78bef80eec721f99081ede5173ad751acc80\": container with ID starting with 596fe207acb42e3216e639196ebe78bef80eec721f99081ede5173ad751acc80 not found: ID does not exist" containerID="596fe207acb42e3216e639196ebe78bef80eec721f99081ede5173ad751acc80" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.110954 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"596fe207acb42e3216e639196ebe78bef80eec721f99081ede5173ad751acc80"} err="failed to get container status \"596fe207acb42e3216e639196ebe78bef80eec721f99081ede5173ad751acc80\": rpc error: code = NotFound desc = could not find container \"596fe207acb42e3216e639196ebe78bef80eec721f99081ede5173ad751acc80\": container with ID starting with 596fe207acb42e3216e639196ebe78bef80eec721f99081ede5173ad751acc80 not found: ID does not exist" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.110978 4774 scope.go:117] "RemoveContainer" containerID="a86fa64b9193a9e410ebadfa3c2adbe8f55679bfc01b57e812625095056114f9" Oct 03 15:03:06 crc kubenswrapper[4774]: E1003 15:03:06.111791 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a86fa64b9193a9e410ebadfa3c2adbe8f55679bfc01b57e812625095056114f9\": container with ID starting with a86fa64b9193a9e410ebadfa3c2adbe8f55679bfc01b57e812625095056114f9 not found: ID does not exist" containerID="a86fa64b9193a9e410ebadfa3c2adbe8f55679bfc01b57e812625095056114f9" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.111841 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a86fa64b9193a9e410ebadfa3c2adbe8f55679bfc01b57e812625095056114f9"} err="failed to get container status \"a86fa64b9193a9e410ebadfa3c2adbe8f55679bfc01b57e812625095056114f9\": rpc error: code = NotFound desc = could not find container \"a86fa64b9193a9e410ebadfa3c2adbe8f55679bfc01b57e812625095056114f9\": container with ID starting with a86fa64b9193a9e410ebadfa3c2adbe8f55679bfc01b57e812625095056114f9 not found: ID does not exist" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.111876 4774 scope.go:117] "RemoveContainer" containerID="b89bf2ce3a3e5e5815366e78eddaa47df6eb0e87baf85f18fcf762d1df57e0ef" Oct 03 15:03:06 crc kubenswrapper[4774]: E1003 15:03:06.112192 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b89bf2ce3a3e5e5815366e78eddaa47df6eb0e87baf85f18fcf762d1df57e0ef\": container with ID starting with b89bf2ce3a3e5e5815366e78eddaa47df6eb0e87baf85f18fcf762d1df57e0ef not found: ID does not exist" containerID="b89bf2ce3a3e5e5815366e78eddaa47df6eb0e87baf85f18fcf762d1df57e0ef" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.112213 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b89bf2ce3a3e5e5815366e78eddaa47df6eb0e87baf85f18fcf762d1df57e0ef"} err="failed to get container status \"b89bf2ce3a3e5e5815366e78eddaa47df6eb0e87baf85f18fcf762d1df57e0ef\": rpc error: code = NotFound desc = could not find container \"b89bf2ce3a3e5e5815366e78eddaa47df6eb0e87baf85f18fcf762d1df57e0ef\": container with ID 
starting with b89bf2ce3a3e5e5815366e78eddaa47df6eb0e87baf85f18fcf762d1df57e0ef not found: ID does not exist" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.112227 4774 scope.go:117] "RemoveContainer" containerID="e77bd534b5300e2440ac57b7cae4166aab2355bf4eb237feec3fc435c863d7a4" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.112558 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e77bd534b5300e2440ac57b7cae4166aab2355bf4eb237feec3fc435c863d7a4"} err="failed to get container status \"e77bd534b5300e2440ac57b7cae4166aab2355bf4eb237feec3fc435c863d7a4\": rpc error: code = NotFound desc = could not find container \"e77bd534b5300e2440ac57b7cae4166aab2355bf4eb237feec3fc435c863d7a4\": container with ID starting with e77bd534b5300e2440ac57b7cae4166aab2355bf4eb237feec3fc435c863d7a4 not found: ID does not exist" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.112577 4774 scope.go:117] "RemoveContainer" containerID="596fe207acb42e3216e639196ebe78bef80eec721f99081ede5173ad751acc80" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.112816 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"596fe207acb42e3216e639196ebe78bef80eec721f99081ede5173ad751acc80"} err="failed to get container status \"596fe207acb42e3216e639196ebe78bef80eec721f99081ede5173ad751acc80\": rpc error: code = NotFound desc = could not find container \"596fe207acb42e3216e639196ebe78bef80eec721f99081ede5173ad751acc80\": container with ID starting with 596fe207acb42e3216e639196ebe78bef80eec721f99081ede5173ad751acc80 not found: ID does not exist" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.112835 4774 scope.go:117] "RemoveContainer" containerID="a86fa64b9193a9e410ebadfa3c2adbe8f55679bfc01b57e812625095056114f9" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.113099 4774 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a86fa64b9193a9e410ebadfa3c2adbe8f55679bfc01b57e812625095056114f9"} err="failed to get container status \"a86fa64b9193a9e410ebadfa3c2adbe8f55679bfc01b57e812625095056114f9\": rpc error: code = NotFound desc = could not find container \"a86fa64b9193a9e410ebadfa3c2adbe8f55679bfc01b57e812625095056114f9\": container with ID starting with a86fa64b9193a9e410ebadfa3c2adbe8f55679bfc01b57e812625095056114f9 not found: ID does not exist" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.113117 4774 scope.go:117] "RemoveContainer" containerID="b89bf2ce3a3e5e5815366e78eddaa47df6eb0e87baf85f18fcf762d1df57e0ef" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.113357 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b89bf2ce3a3e5e5815366e78eddaa47df6eb0e87baf85f18fcf762d1df57e0ef"} err="failed to get container status \"b89bf2ce3a3e5e5815366e78eddaa47df6eb0e87baf85f18fcf762d1df57e0ef\": rpc error: code = NotFound desc = could not find container \"b89bf2ce3a3e5e5815366e78eddaa47df6eb0e87baf85f18fcf762d1df57e0ef\": container with ID starting with b89bf2ce3a3e5e5815366e78eddaa47df6eb0e87baf85f18fcf762d1df57e0ef not found: ID does not exist" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.113392 4774 scope.go:117] "RemoveContainer" containerID="e77bd534b5300e2440ac57b7cae4166aab2355bf4eb237feec3fc435c863d7a4" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.113691 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e77bd534b5300e2440ac57b7cae4166aab2355bf4eb237feec3fc435c863d7a4"} err="failed to get container status \"e77bd534b5300e2440ac57b7cae4166aab2355bf4eb237feec3fc435c863d7a4\": rpc error: code = NotFound desc = could not find container \"e77bd534b5300e2440ac57b7cae4166aab2355bf4eb237feec3fc435c863d7a4\": container with ID starting with e77bd534b5300e2440ac57b7cae4166aab2355bf4eb237feec3fc435c863d7a4 not found: ID does not 
exist" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.113710 4774 scope.go:117] "RemoveContainer" containerID="596fe207acb42e3216e639196ebe78bef80eec721f99081ede5173ad751acc80" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.113977 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"596fe207acb42e3216e639196ebe78bef80eec721f99081ede5173ad751acc80"} err="failed to get container status \"596fe207acb42e3216e639196ebe78bef80eec721f99081ede5173ad751acc80\": rpc error: code = NotFound desc = could not find container \"596fe207acb42e3216e639196ebe78bef80eec721f99081ede5173ad751acc80\": container with ID starting with 596fe207acb42e3216e639196ebe78bef80eec721f99081ede5173ad751acc80 not found: ID does not exist" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.113993 4774 scope.go:117] "RemoveContainer" containerID="a86fa64b9193a9e410ebadfa3c2adbe8f55679bfc01b57e812625095056114f9" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.122814 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a86fa64b9193a9e410ebadfa3c2adbe8f55679bfc01b57e812625095056114f9"} err="failed to get container status \"a86fa64b9193a9e410ebadfa3c2adbe8f55679bfc01b57e812625095056114f9\": rpc error: code = NotFound desc = could not find container \"a86fa64b9193a9e410ebadfa3c2adbe8f55679bfc01b57e812625095056114f9\": container with ID starting with a86fa64b9193a9e410ebadfa3c2adbe8f55679bfc01b57e812625095056114f9 not found: ID does not exist" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.122867 4774 scope.go:117] "RemoveContainer" containerID="b89bf2ce3a3e5e5815366e78eddaa47df6eb0e87baf85f18fcf762d1df57e0ef" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.127643 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b89bf2ce3a3e5e5815366e78eddaa47df6eb0e87baf85f18fcf762d1df57e0ef"} err="failed to get container status 
\"b89bf2ce3a3e5e5815366e78eddaa47df6eb0e87baf85f18fcf762d1df57e0ef\": rpc error: code = NotFound desc = could not find container \"b89bf2ce3a3e5e5815366e78eddaa47df6eb0e87baf85f18fcf762d1df57e0ef\": container with ID starting with b89bf2ce3a3e5e5815366e78eddaa47df6eb0e87baf85f18fcf762d1df57e0ef not found: ID does not exist" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.127691 4774 scope.go:117] "RemoveContainer" containerID="e77bd534b5300e2440ac57b7cae4166aab2355bf4eb237feec3fc435c863d7a4" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.128587 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e77bd534b5300e2440ac57b7cae4166aab2355bf4eb237feec3fc435c863d7a4"} err="failed to get container status \"e77bd534b5300e2440ac57b7cae4166aab2355bf4eb237feec3fc435c863d7a4\": rpc error: code = NotFound desc = could not find container \"e77bd534b5300e2440ac57b7cae4166aab2355bf4eb237feec3fc435c863d7a4\": container with ID starting with e77bd534b5300e2440ac57b7cae4166aab2355bf4eb237feec3fc435c863d7a4 not found: ID does not exist" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.128688 4774 scope.go:117] "RemoveContainer" containerID="596fe207acb42e3216e639196ebe78bef80eec721f99081ede5173ad751acc80" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.128944 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"596fe207acb42e3216e639196ebe78bef80eec721f99081ede5173ad751acc80"} err="failed to get container status \"596fe207acb42e3216e639196ebe78bef80eec721f99081ede5173ad751acc80\": rpc error: code = NotFound desc = could not find container \"596fe207acb42e3216e639196ebe78bef80eec721f99081ede5173ad751acc80\": container with ID starting with 596fe207acb42e3216e639196ebe78bef80eec721f99081ede5173ad751acc80 not found: ID does not exist" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.129041 4774 scope.go:117] "RemoveContainer" 
containerID="a86fa64b9193a9e410ebadfa3c2adbe8f55679bfc01b57e812625095056114f9" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.129241 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a86fa64b9193a9e410ebadfa3c2adbe8f55679bfc01b57e812625095056114f9"} err="failed to get container status \"a86fa64b9193a9e410ebadfa3c2adbe8f55679bfc01b57e812625095056114f9\": rpc error: code = NotFound desc = could not find container \"a86fa64b9193a9e410ebadfa3c2adbe8f55679bfc01b57e812625095056114f9\": container with ID starting with a86fa64b9193a9e410ebadfa3c2adbe8f55679bfc01b57e812625095056114f9 not found: ID does not exist" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.129343 4774 scope.go:117] "RemoveContainer" containerID="b89bf2ce3a3e5e5815366e78eddaa47df6eb0e87baf85f18fcf762d1df57e0ef" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.134039 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b89bf2ce3a3e5e5815366e78eddaa47df6eb0e87baf85f18fcf762d1df57e0ef"} err="failed to get container status \"b89bf2ce3a3e5e5815366e78eddaa47df6eb0e87baf85f18fcf762d1df57e0ef\": rpc error: code = NotFound desc = could not find container \"b89bf2ce3a3e5e5815366e78eddaa47df6eb0e87baf85f18fcf762d1df57e0ef\": container with ID starting with b89bf2ce3a3e5e5815366e78eddaa47df6eb0e87baf85f18fcf762d1df57e0ef not found: ID does not exist" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.150045 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61b07b0f-462d-4e27-b251-85a8d869433a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61b07b0f-462d-4e27-b251-85a8d869433a" (UID: "61b07b0f-462d-4e27-b251-85a8d869433a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.175751 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61b07b0f-462d-4e27-b251-85a8d869433a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.188180 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61b07b0f-462d-4e27-b251-85a8d869433a-config-data" (OuterVolumeSpecName: "config-data") pod "61b07b0f-462d-4e27-b251-85a8d869433a" (UID: "61b07b0f-462d-4e27-b251-85a8d869433a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.281954 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61b07b0f-462d-4e27-b251-85a8d869433a-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.483138 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.494693 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.500834 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:03:06 crc kubenswrapper[4774]: E1003 15:03:06.501363 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61b07b0f-462d-4e27-b251-85a8d869433a" containerName="proxy-httpd" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.501389 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="61b07b0f-462d-4e27-b251-85a8d869433a" containerName="proxy-httpd" Oct 03 15:03:06 crc kubenswrapper[4774]: E1003 15:03:06.501408 4774 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="61b07b0f-462d-4e27-b251-85a8d869433a" containerName="sg-core" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.501415 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="61b07b0f-462d-4e27-b251-85a8d869433a" containerName="sg-core" Oct 03 15:03:06 crc kubenswrapper[4774]: E1003 15:03:06.501428 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61b07b0f-462d-4e27-b251-85a8d869433a" containerName="ceilometer-notification-agent" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.501434 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="61b07b0f-462d-4e27-b251-85a8d869433a" containerName="ceilometer-notification-agent" Oct 03 15:03:06 crc kubenswrapper[4774]: E1003 15:03:06.501444 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61b07b0f-462d-4e27-b251-85a8d869433a" containerName="ceilometer-central-agent" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.501451 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="61b07b0f-462d-4e27-b251-85a8d869433a" containerName="ceilometer-central-agent" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.501614 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="61b07b0f-462d-4e27-b251-85a8d869433a" containerName="ceilometer-notification-agent" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.501631 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="61b07b0f-462d-4e27-b251-85a8d869433a" containerName="sg-core" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.501646 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="61b07b0f-462d-4e27-b251-85a8d869433a" containerName="proxy-httpd" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.501656 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="61b07b0f-462d-4e27-b251-85a8d869433a" containerName="ceilometer-central-agent" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.503197 4774 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.507191 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.507349 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.507562 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.519676 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.587791 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81e37fd3-224d-475c-bb4a-dd840dc3dd48-log-httpd\") pod \"ceilometer-0\" (UID: \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\") " pod="openstack/ceilometer-0" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.587859 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81e37fd3-224d-475c-bb4a-dd840dc3dd48-scripts\") pod \"ceilometer-0\" (UID: \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\") " pod="openstack/ceilometer-0" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.587896 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81e37fd3-224d-475c-bb4a-dd840dc3dd48-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\") " pod="openstack/ceilometer-0" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.587930 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-sfq4f\" (UniqueName: \"kubernetes.io/projected/81e37fd3-224d-475c-bb4a-dd840dc3dd48-kube-api-access-sfq4f\") pod \"ceilometer-0\" (UID: \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\") " pod="openstack/ceilometer-0" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.587957 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81e37fd3-224d-475c-bb4a-dd840dc3dd48-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\") " pod="openstack/ceilometer-0" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.588024 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81e37fd3-224d-475c-bb4a-dd840dc3dd48-run-httpd\") pod \"ceilometer-0\" (UID: \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\") " pod="openstack/ceilometer-0" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.588066 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/81e37fd3-224d-475c-bb4a-dd840dc3dd48-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\") " pod="openstack/ceilometer-0" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.588088 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81e37fd3-224d-475c-bb4a-dd840dc3dd48-config-data\") pod \"ceilometer-0\" (UID: \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\") " pod="openstack/ceilometer-0" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.645411 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.689719 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76464c8b-55de-4a35-91ab-8cee9db23bf7-logs\") pod \"76464c8b-55de-4a35-91ab-8cee9db23bf7\" (UID: \"76464c8b-55de-4a35-91ab-8cee9db23bf7\") " Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.689839 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7xc4\" (UniqueName: \"kubernetes.io/projected/76464c8b-55de-4a35-91ab-8cee9db23bf7-kube-api-access-r7xc4\") pod \"76464c8b-55de-4a35-91ab-8cee9db23bf7\" (UID: \"76464c8b-55de-4a35-91ab-8cee9db23bf7\") " Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.689919 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76464c8b-55de-4a35-91ab-8cee9db23bf7-combined-ca-bundle\") pod \"76464c8b-55de-4a35-91ab-8cee9db23bf7\" (UID: \"76464c8b-55de-4a35-91ab-8cee9db23bf7\") " Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.689972 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76464c8b-55de-4a35-91ab-8cee9db23bf7-config-data\") pod \"76464c8b-55de-4a35-91ab-8cee9db23bf7\" (UID: \"76464c8b-55de-4a35-91ab-8cee9db23bf7\") " Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.690207 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81e37fd3-224d-475c-bb4a-dd840dc3dd48-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\") " pod="openstack/ceilometer-0" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.690272 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/81e37fd3-224d-475c-bb4a-dd840dc3dd48-run-httpd\") pod \"ceilometer-0\" (UID: \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\") " pod="openstack/ceilometer-0" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.690304 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/81e37fd3-224d-475c-bb4a-dd840dc3dd48-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\") " pod="openstack/ceilometer-0" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.690322 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81e37fd3-224d-475c-bb4a-dd840dc3dd48-config-data\") pod \"ceilometer-0\" (UID: \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\") " pod="openstack/ceilometer-0" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.690445 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81e37fd3-224d-475c-bb4a-dd840dc3dd48-log-httpd\") pod \"ceilometer-0\" (UID: \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\") " pod="openstack/ceilometer-0" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.690479 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81e37fd3-224d-475c-bb4a-dd840dc3dd48-scripts\") pod \"ceilometer-0\" (UID: \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\") " pod="openstack/ceilometer-0" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.690503 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81e37fd3-224d-475c-bb4a-dd840dc3dd48-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\") " pod="openstack/ceilometer-0" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.690526 
4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfq4f\" (UniqueName: \"kubernetes.io/projected/81e37fd3-224d-475c-bb4a-dd840dc3dd48-kube-api-access-sfq4f\") pod \"ceilometer-0\" (UID: \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\") " pod="openstack/ceilometer-0" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.691183 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76464c8b-55de-4a35-91ab-8cee9db23bf7-logs" (OuterVolumeSpecName: "logs") pod "76464c8b-55de-4a35-91ab-8cee9db23bf7" (UID: "76464c8b-55de-4a35-91ab-8cee9db23bf7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.692427 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81e37fd3-224d-475c-bb4a-dd840dc3dd48-log-httpd\") pod \"ceilometer-0\" (UID: \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\") " pod="openstack/ceilometer-0" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.693985 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81e37fd3-224d-475c-bb4a-dd840dc3dd48-run-httpd\") pod \"ceilometer-0\" (UID: \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\") " pod="openstack/ceilometer-0" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.695122 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/81e37fd3-224d-475c-bb4a-dd840dc3dd48-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\") " pod="openstack/ceilometer-0" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.697085 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81e37fd3-224d-475c-bb4a-dd840dc3dd48-config-data\") pod 
\"ceilometer-0\" (UID: \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\") " pod="openstack/ceilometer-0" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.700991 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81e37fd3-224d-475c-bb4a-dd840dc3dd48-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\") " pod="openstack/ceilometer-0" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.703290 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76464c8b-55de-4a35-91ab-8cee9db23bf7-kube-api-access-r7xc4" (OuterVolumeSpecName: "kube-api-access-r7xc4") pod "76464c8b-55de-4a35-91ab-8cee9db23bf7" (UID: "76464c8b-55de-4a35-91ab-8cee9db23bf7"). InnerVolumeSpecName "kube-api-access-r7xc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.704229 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81e37fd3-224d-475c-bb4a-dd840dc3dd48-scripts\") pod \"ceilometer-0\" (UID: \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\") " pod="openstack/ceilometer-0" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.720416 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfq4f\" (UniqueName: \"kubernetes.io/projected/81e37fd3-224d-475c-bb4a-dd840dc3dd48-kube-api-access-sfq4f\") pod \"ceilometer-0\" (UID: \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\") " pod="openstack/ceilometer-0" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.724785 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81e37fd3-224d-475c-bb4a-dd840dc3dd48-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\") " pod="openstack/ceilometer-0" Oct 03 15:03:06 crc 
kubenswrapper[4774]: I1003 15:03:06.727768 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76464c8b-55de-4a35-91ab-8cee9db23bf7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76464c8b-55de-4a35-91ab-8cee9db23bf7" (UID: "76464c8b-55de-4a35-91ab-8cee9db23bf7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.728600 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76464c8b-55de-4a35-91ab-8cee9db23bf7-config-data" (OuterVolumeSpecName: "config-data") pod "76464c8b-55de-4a35-91ab-8cee9db23bf7" (UID: "76464c8b-55de-4a35-91ab-8cee9db23bf7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.791292 4774 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76464c8b-55de-4a35-91ab-8cee9db23bf7-logs\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.791319 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7xc4\" (UniqueName: \"kubernetes.io/projected/76464c8b-55de-4a35-91ab-8cee9db23bf7-kube-api-access-r7xc4\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.791333 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76464c8b-55de-4a35-91ab-8cee9db23bf7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.791341 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76464c8b-55de-4a35-91ab-8cee9db23bf7-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.858023 4774 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"18303598-0afd-48d0-a93a-a523807d8e37","Type":"ContainerStarted","Data":"68c7e12a3cef043686c82c381ea364215c45020eb7b92753e96f3c8426c5e34b"} Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.859089 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.866285 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"14f3b6b4-467a-4848-aed7-abeffc7767ad","Type":"ContainerStarted","Data":"1325cc23a9db1562b57aa0518519bba5423158658166edf999ebd9eee30d8fd2"} Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.869893 4774 generic.go:334] "Generic (PLEG): container finished" podID="76464c8b-55de-4a35-91ab-8cee9db23bf7" containerID="cb39e10eda03d794bacdb597f3b76a9eb8922ffe455b02d2805099bb5d135c90" exitCode=0 Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.870230 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.870218 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"76464c8b-55de-4a35-91ab-8cee9db23bf7","Type":"ContainerDied","Data":"cb39e10eda03d794bacdb597f3b76a9eb8922ffe455b02d2805099bb5d135c90"} Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.870364 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"76464c8b-55de-4a35-91ab-8cee9db23bf7","Type":"ContainerDied","Data":"3b177d13f217a9408fbf6d7a50f9f857bcf02e7fedff5439f601faa34883e513"} Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.870393 4774 scope.go:117] "RemoveContainer" containerID="cb39e10eda03d794bacdb597f3b76a9eb8922ffe455b02d2805099bb5d135c90" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.897288 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.897268227 podStartE2EDuration="2.897268227s" podCreationTimestamp="2025-10-03 15:03:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:03:06.879673651 +0000 UTC m=+1209.468877103" watchObservedRunningTime="2025-10-03 15:03:06.897268227 +0000 UTC m=+1209.486471679" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.908700 4774 scope.go:117] "RemoveContainer" containerID="4a0e88dcb2bfde4646459d68166019721ef4badd66a6f7b4dca088147ef07308" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.915215 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.915193341 podStartE2EDuration="2.915193341s" podCreationTimestamp="2025-10-03 15:03:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 
15:03:06.91032043 +0000 UTC m=+1209.499523882" watchObservedRunningTime="2025-10-03 15:03:06.915193341 +0000 UTC m=+1209.504396803" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.937290 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.953155 4774 scope.go:117] "RemoveContainer" containerID="cb39e10eda03d794bacdb597f3b76a9eb8922ffe455b02d2805099bb5d135c90" Oct 03 15:03:06 crc kubenswrapper[4774]: E1003 15:03:06.954394 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb39e10eda03d794bacdb597f3b76a9eb8922ffe455b02d2805099bb5d135c90\": container with ID starting with cb39e10eda03d794bacdb597f3b76a9eb8922ffe455b02d2805099bb5d135c90 not found: ID does not exist" containerID="cb39e10eda03d794bacdb597f3b76a9eb8922ffe455b02d2805099bb5d135c90" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.954502 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb39e10eda03d794bacdb597f3b76a9eb8922ffe455b02d2805099bb5d135c90"} err="failed to get container status \"cb39e10eda03d794bacdb597f3b76a9eb8922ffe455b02d2805099bb5d135c90\": rpc error: code = NotFound desc = could not find container \"cb39e10eda03d794bacdb597f3b76a9eb8922ffe455b02d2805099bb5d135c90\": container with ID starting with cb39e10eda03d794bacdb597f3b76a9eb8922ffe455b02d2805099bb5d135c90 not found: ID does not exist" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.954576 4774 scope.go:117] "RemoveContainer" containerID="4a0e88dcb2bfde4646459d68166019721ef4badd66a6f7b4dca088147ef07308" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.954710 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 15:03:06 crc kubenswrapper[4774]: E1003 15:03:06.954947 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"4a0e88dcb2bfde4646459d68166019721ef4badd66a6f7b4dca088147ef07308\": container with ID starting with 4a0e88dcb2bfde4646459d68166019721ef4badd66a6f7b4dca088147ef07308 not found: ID does not exist" containerID="4a0e88dcb2bfde4646459d68166019721ef4badd66a6f7b4dca088147ef07308" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.955024 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a0e88dcb2bfde4646459d68166019721ef4badd66a6f7b4dca088147ef07308"} err="failed to get container status \"4a0e88dcb2bfde4646459d68166019721ef4badd66a6f7b4dca088147ef07308\": rpc error: code = NotFound desc = could not find container \"4a0e88dcb2bfde4646459d68166019721ef4badd66a6f7b4dca088147ef07308\": container with ID starting with 4a0e88dcb2bfde4646459d68166019721ef4badd66a6f7b4dca088147ef07308 not found: ID does not exist" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.963224 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.971780 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 15:03:06 crc kubenswrapper[4774]: E1003 15:03:06.972328 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76464c8b-55de-4a35-91ab-8cee9db23bf7" containerName="nova-api-log" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.972425 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="76464c8b-55de-4a35-91ab-8cee9db23bf7" containerName="nova-api-log" Oct 03 15:03:06 crc kubenswrapper[4774]: E1003 15:03:06.972525 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76464c8b-55de-4a35-91ab-8cee9db23bf7" containerName="nova-api-api" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.972591 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="76464c8b-55de-4a35-91ab-8cee9db23bf7" containerName="nova-api-api" Oct 
03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.972838 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="76464c8b-55de-4a35-91ab-8cee9db23bf7" containerName="nova-api-api" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.972928 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="76464c8b-55de-4a35-91ab-8cee9db23bf7" containerName="nova-api-log" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.977730 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.981163 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.995159 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf2fba80-784f-471a-a4ca-2437ab06c2cb-logs\") pod \"nova-api-0\" (UID: \"bf2fba80-784f-471a-a4ca-2437ab06c2cb\") " pod="openstack/nova-api-0" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.995437 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gcmk\" (UniqueName: \"kubernetes.io/projected/bf2fba80-784f-471a-a4ca-2437ab06c2cb-kube-api-access-7gcmk\") pod \"nova-api-0\" (UID: \"bf2fba80-784f-471a-a4ca-2437ab06c2cb\") " pod="openstack/nova-api-0" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.995569 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf2fba80-784f-471a-a4ca-2437ab06c2cb-config-data\") pod \"nova-api-0\" (UID: \"bf2fba80-784f-471a-a4ca-2437ab06c2cb\") " pod="openstack/nova-api-0" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.995680 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf2fba80-784f-471a-a4ca-2437ab06c2cb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bf2fba80-784f-471a-a4ca-2437ab06c2cb\") " pod="openstack/nova-api-0" Oct 03 15:03:06 crc kubenswrapper[4774]: I1003 15:03:06.998045 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 15:03:07 crc kubenswrapper[4774]: I1003 15:03:07.096764 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf2fba80-784f-471a-a4ca-2437ab06c2cb-logs\") pod \"nova-api-0\" (UID: \"bf2fba80-784f-471a-a4ca-2437ab06c2cb\") " pod="openstack/nova-api-0" Oct 03 15:03:07 crc kubenswrapper[4774]: I1003 15:03:07.096847 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gcmk\" (UniqueName: \"kubernetes.io/projected/bf2fba80-784f-471a-a4ca-2437ab06c2cb-kube-api-access-7gcmk\") pod \"nova-api-0\" (UID: \"bf2fba80-784f-471a-a4ca-2437ab06c2cb\") " pod="openstack/nova-api-0" Oct 03 15:03:07 crc kubenswrapper[4774]: I1003 15:03:07.096880 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf2fba80-784f-471a-a4ca-2437ab06c2cb-config-data\") pod \"nova-api-0\" (UID: \"bf2fba80-784f-471a-a4ca-2437ab06c2cb\") " pod="openstack/nova-api-0" Oct 03 15:03:07 crc kubenswrapper[4774]: I1003 15:03:07.096946 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf2fba80-784f-471a-a4ca-2437ab06c2cb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bf2fba80-784f-471a-a4ca-2437ab06c2cb\") " pod="openstack/nova-api-0" Oct 03 15:03:07 crc kubenswrapper[4774]: I1003 15:03:07.097215 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf2fba80-784f-471a-a4ca-2437ab06c2cb-logs\") pod 
\"nova-api-0\" (UID: \"bf2fba80-784f-471a-a4ca-2437ab06c2cb\") " pod="openstack/nova-api-0" Oct 03 15:03:07 crc kubenswrapper[4774]: I1003 15:03:07.101634 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf2fba80-784f-471a-a4ca-2437ab06c2cb-config-data\") pod \"nova-api-0\" (UID: \"bf2fba80-784f-471a-a4ca-2437ab06c2cb\") " pod="openstack/nova-api-0" Oct 03 15:03:07 crc kubenswrapper[4774]: I1003 15:03:07.101882 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf2fba80-784f-471a-a4ca-2437ab06c2cb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bf2fba80-784f-471a-a4ca-2437ab06c2cb\") " pod="openstack/nova-api-0" Oct 03 15:03:07 crc kubenswrapper[4774]: I1003 15:03:07.112208 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gcmk\" (UniqueName: \"kubernetes.io/projected/bf2fba80-784f-471a-a4ca-2437ab06c2cb-kube-api-access-7gcmk\") pod \"nova-api-0\" (UID: \"bf2fba80-784f-471a-a4ca-2437ab06c2cb\") " pod="openstack/nova-api-0" Oct 03 15:03:07 crc kubenswrapper[4774]: I1003 15:03:07.302620 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 15:03:07 crc kubenswrapper[4774]: I1003 15:03:07.341671 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61b07b0f-462d-4e27-b251-85a8d869433a" path="/var/lib/kubelet/pods/61b07b0f-462d-4e27-b251-85a8d869433a/volumes" Oct 03 15:03:07 crc kubenswrapper[4774]: I1003 15:03:07.343013 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76464c8b-55de-4a35-91ab-8cee9db23bf7" path="/var/lib/kubelet/pods/76464c8b-55de-4a35-91ab-8cee9db23bf7/volumes" Oct 03 15:03:07 crc kubenswrapper[4774]: I1003 15:03:07.406627 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:03:07 crc kubenswrapper[4774]: I1003 15:03:07.775267 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 15:03:07 crc kubenswrapper[4774]: I1003 15:03:07.880691 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81e37fd3-224d-475c-bb4a-dd840dc3dd48","Type":"ContainerStarted","Data":"cdfe3ed7bc6a828b89de06c8f8a4a652114f05dd1ea0dd7adbd31efbec7daba2"} Oct 03 15:03:07 crc kubenswrapper[4774]: I1003 15:03:07.882274 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bf2fba80-784f-471a-a4ca-2437ab06c2cb","Type":"ContainerStarted","Data":"e13dfc468812f37c077f8d9f4a1f0bc5528dcb81f7a617ebf93c0d6476c3783d"} Oct 03 15:03:08 crc kubenswrapper[4774]: I1003 15:03:08.155272 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 15:03:08 crc kubenswrapper[4774]: I1003 15:03:08.155342 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 15:03:08 crc kubenswrapper[4774]: I1003 15:03:08.897965 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"bf2fba80-784f-471a-a4ca-2437ab06c2cb","Type":"ContainerStarted","Data":"e9506a626cd3852670069df872c66874a93031ebc3852911f346f884e9800645"} Oct 03 15:03:08 crc kubenswrapper[4774]: I1003 15:03:08.898472 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bf2fba80-784f-471a-a4ca-2437ab06c2cb","Type":"ContainerStarted","Data":"baa12dc79699561010471bd70a2484d5ff9f6cd6d74a289144312749d82b84e6"} Oct 03 15:03:08 crc kubenswrapper[4774]: I1003 15:03:08.901087 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81e37fd3-224d-475c-bb4a-dd840dc3dd48","Type":"ContainerStarted","Data":"71f6629c95f011d81aefc8a8aef6119ca3a8f22ace59b87dab2c29ee380c993a"} Oct 03 15:03:08 crc kubenswrapper[4774]: I1003 15:03:08.930300 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.9302779660000002 podStartE2EDuration="2.930277966s" podCreationTimestamp="2025-10-03 15:03:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:03:08.921759585 +0000 UTC m=+1211.510963087" watchObservedRunningTime="2025-10-03 15:03:08.930277966 +0000 UTC m=+1211.519481428" Oct 03 15:03:09 crc kubenswrapper[4774]: I1003 15:03:09.911243 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81e37fd3-224d-475c-bb4a-dd840dc3dd48","Type":"ContainerStarted","Data":"f58232fe6ee207596593c6637dbb4652f5612335c866ae4ea26e371fd5113d95"} Oct 03 15:03:10 crc kubenswrapper[4774]: I1003 15:03:10.217814 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 03 15:03:10 crc kubenswrapper[4774]: I1003 15:03:10.232341 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 03 15:03:10 crc kubenswrapper[4774]: I1003 
15:03:10.932645 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81e37fd3-224d-475c-bb4a-dd840dc3dd48","Type":"ContainerStarted","Data":"36c6a230b9ae15d9fc5517f74a49cc53ec0cd157ee350e534d9c2cb4d877897a"} Oct 03 15:03:11 crc kubenswrapper[4774]: I1003 15:03:11.947730 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81e37fd3-224d-475c-bb4a-dd840dc3dd48","Type":"ContainerStarted","Data":"94174702c920dca8ed048feb8642fad03dbe1e88627679d2fe53e3fc3c4dbfcc"} Oct 03 15:03:11 crc kubenswrapper[4774]: I1003 15:03:11.948536 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 15:03:11 crc kubenswrapper[4774]: I1003 15:03:11.978297 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.9685732200000001 podStartE2EDuration="5.978273233s" podCreationTimestamp="2025-10-03 15:03:06 +0000 UTC" firstStartedPulling="2025-10-03 15:03:07.45939666 +0000 UTC m=+1210.048600112" lastFinishedPulling="2025-10-03 15:03:11.469096663 +0000 UTC m=+1214.058300125" observedRunningTime="2025-10-03 15:03:11.97327776 +0000 UTC m=+1214.562481242" watchObservedRunningTime="2025-10-03 15:03:11.978273233 +0000 UTC m=+1214.567476715" Oct 03 15:03:13 crc kubenswrapper[4774]: I1003 15:03:13.154716 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 15:03:13 crc kubenswrapper[4774]: I1003 15:03:13.154984 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 15:03:14 crc kubenswrapper[4774]: I1003 15:03:14.172760 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2a062c05-477a-48c4-be23-ec0690970699" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.194:8775/\": net/http: request canceled 
(Client.Timeout exceeded while awaiting headers)" Oct 03 15:03:14 crc kubenswrapper[4774]: I1003 15:03:14.172763 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2a062c05-477a-48c4-be23-ec0690970699" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.194:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 15:03:14 crc kubenswrapper[4774]: I1003 15:03:14.232762 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 03 15:03:15 crc kubenswrapper[4774]: I1003 15:03:15.232524 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 03 15:03:15 crc kubenswrapper[4774]: I1003 15:03:15.269061 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 03 15:03:16 crc kubenswrapper[4774]: I1003 15:03:16.025307 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 03 15:03:17 crc kubenswrapper[4774]: I1003 15:03:17.312991 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 15:03:17 crc kubenswrapper[4774]: I1003 15:03:17.313028 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 15:03:18 crc kubenswrapper[4774]: I1003 15:03:18.386821 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bf2fba80-784f-471a-a4ca-2437ab06c2cb" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.199:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 15:03:18 crc kubenswrapper[4774]: I1003 15:03:18.386784 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bf2fba80-784f-471a-a4ca-2437ab06c2cb" 
containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.199:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 15:03:20 crc kubenswrapper[4774]: I1003 15:03:20.653765 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:03:20 crc kubenswrapper[4774]: I1003 15:03:20.654084 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:03:20 crc kubenswrapper[4774]: I1003 15:03:20.654140 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" Oct 03 15:03:20 crc kubenswrapper[4774]: I1003 15:03:20.655051 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3d631133d606feac7cf551b661ad83ff63af803d9385eff9dee1aa2b7ab7a1cd"} pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 15:03:20 crc kubenswrapper[4774]: I1003 15:03:20.655140 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" containerID="cri-o://3d631133d606feac7cf551b661ad83ff63af803d9385eff9dee1aa2b7ab7a1cd" gracePeriod=600 Oct 03 15:03:21 crc kubenswrapper[4774]: I1003 
15:03:21.051221 4774 generic.go:334] "Generic (PLEG): container finished" podID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerID="3d631133d606feac7cf551b661ad83ff63af803d9385eff9dee1aa2b7ab7a1cd" exitCode=0 Oct 03 15:03:21 crc kubenswrapper[4774]: I1003 15:03:21.051313 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" event={"ID":"ca37ac4b-f421-4198-a179-12901d36f0f5","Type":"ContainerDied","Data":"3d631133d606feac7cf551b661ad83ff63af803d9385eff9dee1aa2b7ab7a1cd"} Oct 03 15:03:21 crc kubenswrapper[4774]: I1003 15:03:21.051951 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" event={"ID":"ca37ac4b-f421-4198-a179-12901d36f0f5","Type":"ContainerStarted","Data":"44818931129eda9720850f3c5b49565e0ee25d9e624e68b358f2ade90a0039e5"} Oct 03 15:03:21 crc kubenswrapper[4774]: I1003 15:03:21.051998 4774 scope.go:117] "RemoveContainer" containerID="f6858585d748d7516503bd2f90216465db181a380255963647c750b70d73b203" Oct 03 15:03:23 crc kubenswrapper[4774]: I1003 15:03:23.161711 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 03 15:03:23 crc kubenswrapper[4774]: I1003 15:03:23.163821 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 03 15:03:23 crc kubenswrapper[4774]: I1003 15:03:23.171294 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 03 15:03:23 crc kubenswrapper[4774]: I1003 15:03:23.174147 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 03 15:03:27 crc kubenswrapper[4774]: I1003 15:03:27.317500 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 15:03:27 crc kubenswrapper[4774]: I1003 15:03:27.318225 4774 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 15:03:27 crc kubenswrapper[4774]: I1003 15:03:27.318670 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 15:03:27 crc kubenswrapper[4774]: I1003 15:03:27.318743 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 15:03:27 crc kubenswrapper[4774]: I1003 15:03:27.324236 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 15:03:27 crc kubenswrapper[4774]: I1003 15:03:27.331831 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 15:03:27 crc kubenswrapper[4774]: I1003 15:03:27.545077 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-wsntx"] Oct 03 15:03:27 crc kubenswrapper[4774]: I1003 15:03:27.547436 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-wsntx" Oct 03 15:03:27 crc kubenswrapper[4774]: I1003 15:03:27.559664 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-wsntx"] Oct 03 15:03:27 crc kubenswrapper[4774]: I1003 15:03:27.628066 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12ecba83-5e7f-4bec-a930-a02540cbde61-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-wsntx\" (UID: \"12ecba83-5e7f-4bec-a930-a02540cbde61\") " pod="openstack/dnsmasq-dns-59cf4bdb65-wsntx" Oct 03 15:03:27 crc kubenswrapper[4774]: I1003 15:03:27.628137 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/12ecba83-5e7f-4bec-a930-a02540cbde61-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-wsntx\" (UID: \"12ecba83-5e7f-4bec-a930-a02540cbde61\") " 
pod="openstack/dnsmasq-dns-59cf4bdb65-wsntx" Oct 03 15:03:27 crc kubenswrapper[4774]: I1003 15:03:27.628213 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12ecba83-5e7f-4bec-a930-a02540cbde61-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-wsntx\" (UID: \"12ecba83-5e7f-4bec-a930-a02540cbde61\") " pod="openstack/dnsmasq-dns-59cf4bdb65-wsntx" Oct 03 15:03:27 crc kubenswrapper[4774]: I1003 15:03:27.628363 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxqqw\" (UniqueName: \"kubernetes.io/projected/12ecba83-5e7f-4bec-a930-a02540cbde61-kube-api-access-sxqqw\") pod \"dnsmasq-dns-59cf4bdb65-wsntx\" (UID: \"12ecba83-5e7f-4bec-a930-a02540cbde61\") " pod="openstack/dnsmasq-dns-59cf4bdb65-wsntx" Oct 03 15:03:27 crc kubenswrapper[4774]: I1003 15:03:27.628568 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12ecba83-5e7f-4bec-a930-a02540cbde61-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-wsntx\" (UID: \"12ecba83-5e7f-4bec-a930-a02540cbde61\") " pod="openstack/dnsmasq-dns-59cf4bdb65-wsntx" Oct 03 15:03:27 crc kubenswrapper[4774]: I1003 15:03:27.628624 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12ecba83-5e7f-4bec-a930-a02540cbde61-config\") pod \"dnsmasq-dns-59cf4bdb65-wsntx\" (UID: \"12ecba83-5e7f-4bec-a930-a02540cbde61\") " pod="openstack/dnsmasq-dns-59cf4bdb65-wsntx" Oct 03 15:03:27 crc kubenswrapper[4774]: I1003 15:03:27.731398 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12ecba83-5e7f-4bec-a930-a02540cbde61-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-wsntx\" (UID: 
\"12ecba83-5e7f-4bec-a930-a02540cbde61\") " pod="openstack/dnsmasq-dns-59cf4bdb65-wsntx" Oct 03 15:03:27 crc kubenswrapper[4774]: I1003 15:03:27.731452 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12ecba83-5e7f-4bec-a930-a02540cbde61-config\") pod \"dnsmasq-dns-59cf4bdb65-wsntx\" (UID: \"12ecba83-5e7f-4bec-a930-a02540cbde61\") " pod="openstack/dnsmasq-dns-59cf4bdb65-wsntx" Oct 03 15:03:27 crc kubenswrapper[4774]: I1003 15:03:27.731503 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12ecba83-5e7f-4bec-a930-a02540cbde61-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-wsntx\" (UID: \"12ecba83-5e7f-4bec-a930-a02540cbde61\") " pod="openstack/dnsmasq-dns-59cf4bdb65-wsntx" Oct 03 15:03:27 crc kubenswrapper[4774]: I1003 15:03:27.731553 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/12ecba83-5e7f-4bec-a930-a02540cbde61-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-wsntx\" (UID: \"12ecba83-5e7f-4bec-a930-a02540cbde61\") " pod="openstack/dnsmasq-dns-59cf4bdb65-wsntx" Oct 03 15:03:27 crc kubenswrapper[4774]: I1003 15:03:27.731579 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12ecba83-5e7f-4bec-a930-a02540cbde61-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-wsntx\" (UID: \"12ecba83-5e7f-4bec-a930-a02540cbde61\") " pod="openstack/dnsmasq-dns-59cf4bdb65-wsntx" Oct 03 15:03:27 crc kubenswrapper[4774]: I1003 15:03:27.731612 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxqqw\" (UniqueName: \"kubernetes.io/projected/12ecba83-5e7f-4bec-a930-a02540cbde61-kube-api-access-sxqqw\") pod \"dnsmasq-dns-59cf4bdb65-wsntx\" (UID: \"12ecba83-5e7f-4bec-a930-a02540cbde61\") " 
pod="openstack/dnsmasq-dns-59cf4bdb65-wsntx" Oct 03 15:03:27 crc kubenswrapper[4774]: I1003 15:03:27.733008 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12ecba83-5e7f-4bec-a930-a02540cbde61-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-wsntx\" (UID: \"12ecba83-5e7f-4bec-a930-a02540cbde61\") " pod="openstack/dnsmasq-dns-59cf4bdb65-wsntx" Oct 03 15:03:27 crc kubenswrapper[4774]: I1003 15:03:27.736942 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/12ecba83-5e7f-4bec-a930-a02540cbde61-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-wsntx\" (UID: \"12ecba83-5e7f-4bec-a930-a02540cbde61\") " pod="openstack/dnsmasq-dns-59cf4bdb65-wsntx" Oct 03 15:03:27 crc kubenswrapper[4774]: I1003 15:03:27.738203 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12ecba83-5e7f-4bec-a930-a02540cbde61-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-wsntx\" (UID: \"12ecba83-5e7f-4bec-a930-a02540cbde61\") " pod="openstack/dnsmasq-dns-59cf4bdb65-wsntx" Oct 03 15:03:27 crc kubenswrapper[4774]: I1003 15:03:27.741965 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12ecba83-5e7f-4bec-a930-a02540cbde61-config\") pod \"dnsmasq-dns-59cf4bdb65-wsntx\" (UID: \"12ecba83-5e7f-4bec-a930-a02540cbde61\") " pod="openstack/dnsmasq-dns-59cf4bdb65-wsntx" Oct 03 15:03:27 crc kubenswrapper[4774]: I1003 15:03:27.743254 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12ecba83-5e7f-4bec-a930-a02540cbde61-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-wsntx\" (UID: \"12ecba83-5e7f-4bec-a930-a02540cbde61\") " pod="openstack/dnsmasq-dns-59cf4bdb65-wsntx" Oct 03 15:03:27 crc kubenswrapper[4774]: I1003 15:03:27.754908 4774 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxqqw\" (UniqueName: \"kubernetes.io/projected/12ecba83-5e7f-4bec-a930-a02540cbde61-kube-api-access-sxqqw\") pod \"dnsmasq-dns-59cf4bdb65-wsntx\" (UID: \"12ecba83-5e7f-4bec-a930-a02540cbde61\") " pod="openstack/dnsmasq-dns-59cf4bdb65-wsntx" Oct 03 15:03:27 crc kubenswrapper[4774]: I1003 15:03:27.864858 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-wsntx" Oct 03 15:03:28 crc kubenswrapper[4774]: E1003 15:03:28.096067 4774 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/8b28575618e999080e85c5962893c1bc3470cdf313a7299922de8ff33d4378b1/diff" to get inode usage: stat /var/lib/containers/storage/overlay/8b28575618e999080e85c5962893c1bc3470cdf313a7299922de8ff33d4378b1/diff: no such file or directory, extraDiskErr: Oct 03 15:03:28 crc kubenswrapper[4774]: I1003 15:03:28.346776 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-wsntx"] Oct 03 15:03:28 crc kubenswrapper[4774]: W1003 15:03:28.632306 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1a3949e_e232_492e_98fe_47a948f55f73.slice/crio-07a03d55d2054a7881172ed076d932cd3cbd0f012fbf770f48570ea5cef4adfc WatchSource:0}: Error finding container 07a03d55d2054a7881172ed076d932cd3cbd0f012fbf770f48570ea5cef4adfc: Status 404 returned error can't find the container with id 07a03d55d2054a7881172ed076d932cd3cbd0f012fbf770f48570ea5cef4adfc Oct 03 15:03:28 crc kubenswrapper[4774]: E1003 15:03:28.633295 4774 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51d6dfe6_9084_4868_93c7_58fe660a6d98.slice/crio-4205844f257ec77cf70f31bbd93545ab598e4858460720b4b87a6db8b74377c2: Error finding container 
4205844f257ec77cf70f31bbd93545ab598e4858460720b4b87a6db8b74377c2: Status 404 returned error can't find the container with id 4205844f257ec77cf70f31bbd93545ab598e4858460720b4b87a6db8b74377c2 Oct 03 15:03:28 crc kubenswrapper[4774]: W1003 15:03:28.633691 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1a3949e_e232_492e_98fe_47a948f55f73.slice/crio-bed249a2866a161e6b73cba119b1ce00685acd7b796346667016900a5afcf0e7.scope WatchSource:0}: Error finding container bed249a2866a161e6b73cba119b1ce00685acd7b796346667016900a5afcf0e7: Status 404 returned error can't find the container with id bed249a2866a161e6b73cba119b1ce00685acd7b796346667016900a5afcf0e7 Oct 03 15:03:28 crc kubenswrapper[4774]: W1003 15:03:28.634221 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1a3949e_e232_492e_98fe_47a948f55f73.slice/crio-572248f1d44fd639c647128a8abe1059f52cc44382591ea6036c088066686884.scope WatchSource:0}: Error finding container 572248f1d44fd639c647128a8abe1059f52cc44382591ea6036c088066686884: Status 404 returned error can't find the container with id 572248f1d44fd639c647128a8abe1059f52cc44382591ea6036c088066686884 Oct 03 15:03:28 crc kubenswrapper[4774]: E1003 15:03:28.933993 4774 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76464c8b_55de_4a35_91ab_8cee9db23bf7.slice/crio-3b177d13f217a9408fbf6d7a50f9f857bcf02e7fedff5439f601faa34883e513\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca37ac4b_f421_4198_a179_12901d36f0f5.slice/crio-conmon-3d631133d606feac7cf551b661ad83ff63af803d9385eff9dee1aa2b7ab7a1cd.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcee99571_3f9a_49ba_bced_bbb3e3a723e7.slice/crio-ea00fdd4bff47426ae613c584b950e18dcd592788814ad50bf80cea80b9d70c9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca37ac4b_f421_4198_a179_12901d36f0f5.slice/crio-3d631133d606feac7cf551b661ad83ff63af803d9385eff9dee1aa2b7ab7a1cd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76464c8b_55de_4a35_91ab_8cee9db23bf7.slice\": RecentStats: unable to find data in memory cache]" Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.064160 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.134444 4774 generic.go:334] "Generic (PLEG): container finished" podID="12ecba83-5e7f-4bec-a930-a02540cbde61" containerID="77467c335237a1ec1f212a7b6867d099fc810f7c9b82767748f820e81d36e19d" exitCode=0 Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.134556 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-wsntx" event={"ID":"12ecba83-5e7f-4bec-a930-a02540cbde61","Type":"ContainerDied","Data":"77467c335237a1ec1f212a7b6867d099fc810f7c9b82767748f820e81d36e19d"} Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.134591 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-wsntx" event={"ID":"12ecba83-5e7f-4bec-a930-a02540cbde61","Type":"ContainerStarted","Data":"98ed8c910d82f0bb860453accdb94a4bb2537970d93be35c61190584082dcecb"} Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.136891 4774 generic.go:334] "Generic (PLEG): container finished" podID="cee99571-3f9a-49ba-bced-bbb3e3a723e7" containerID="ea00fdd4bff47426ae613c584b950e18dcd592788814ad50bf80cea80b9d70c9" exitCode=137 Oct 03 15:03:29 crc 
kubenswrapper[4774]: I1003 15:03:29.136932 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.136993 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cee99571-3f9a-49ba-bced-bbb3e3a723e7","Type":"ContainerDied","Data":"ea00fdd4bff47426ae613c584b950e18dcd592788814ad50bf80cea80b9d70c9"} Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.137019 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cee99571-3f9a-49ba-bced-bbb3e3a723e7","Type":"ContainerDied","Data":"d9218f12af1f4b62b87429e5a9fa725c6a58ed2314b58761a08714084bf94ceb"} Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.137039 4774 scope.go:117] "RemoveContainer" containerID="ea00fdd4bff47426ae613c584b950e18dcd592788814ad50bf80cea80b9d70c9" Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.178126 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cee99571-3f9a-49ba-bced-bbb3e3a723e7-config-data\") pod \"cee99571-3f9a-49ba-bced-bbb3e3a723e7\" (UID: \"cee99571-3f9a-49ba-bced-bbb3e3a723e7\") " Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.178184 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t25h6\" (UniqueName: \"kubernetes.io/projected/cee99571-3f9a-49ba-bced-bbb3e3a723e7-kube-api-access-t25h6\") pod \"cee99571-3f9a-49ba-bced-bbb3e3a723e7\" (UID: \"cee99571-3f9a-49ba-bced-bbb3e3a723e7\") " Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.178243 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cee99571-3f9a-49ba-bced-bbb3e3a723e7-combined-ca-bundle\") pod \"cee99571-3f9a-49ba-bced-bbb3e3a723e7\" (UID: 
\"cee99571-3f9a-49ba-bced-bbb3e3a723e7\") " Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.187599 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cee99571-3f9a-49ba-bced-bbb3e3a723e7-kube-api-access-t25h6" (OuterVolumeSpecName: "kube-api-access-t25h6") pod "cee99571-3f9a-49ba-bced-bbb3e3a723e7" (UID: "cee99571-3f9a-49ba-bced-bbb3e3a723e7"). InnerVolumeSpecName "kube-api-access-t25h6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.218170 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cee99571-3f9a-49ba-bced-bbb3e3a723e7-config-data" (OuterVolumeSpecName: "config-data") pod "cee99571-3f9a-49ba-bced-bbb3e3a723e7" (UID: "cee99571-3f9a-49ba-bced-bbb3e3a723e7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.220911 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cee99571-3f9a-49ba-bced-bbb3e3a723e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cee99571-3f9a-49ba-bced-bbb3e3a723e7" (UID: "cee99571-3f9a-49ba-bced-bbb3e3a723e7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.253641 4774 scope.go:117] "RemoveContainer" containerID="ea00fdd4bff47426ae613c584b950e18dcd592788814ad50bf80cea80b9d70c9" Oct 03 15:03:29 crc kubenswrapper[4774]: E1003 15:03:29.254056 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea00fdd4bff47426ae613c584b950e18dcd592788814ad50bf80cea80b9d70c9\": container with ID starting with ea00fdd4bff47426ae613c584b950e18dcd592788814ad50bf80cea80b9d70c9 not found: ID does not exist" containerID="ea00fdd4bff47426ae613c584b950e18dcd592788814ad50bf80cea80b9d70c9" Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.254142 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea00fdd4bff47426ae613c584b950e18dcd592788814ad50bf80cea80b9d70c9"} err="failed to get container status \"ea00fdd4bff47426ae613c584b950e18dcd592788814ad50bf80cea80b9d70c9\": rpc error: code = NotFound desc = could not find container \"ea00fdd4bff47426ae613c584b950e18dcd592788814ad50bf80cea80b9d70c9\": container with ID starting with ea00fdd4bff47426ae613c584b950e18dcd592788814ad50bf80cea80b9d70c9 not found: ID does not exist" Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.281071 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cee99571-3f9a-49ba-bced-bbb3e3a723e7-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.281105 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t25h6\" (UniqueName: \"kubernetes.io/projected/cee99571-3f9a-49ba-bced-bbb3e3a723e7-kube-api-access-t25h6\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.281117 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cee99571-3f9a-49ba-bced-bbb3e3a723e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.460106 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.474044 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.483748 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 15:03:29 crc kubenswrapper[4774]: E1003 15:03:29.484501 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cee99571-3f9a-49ba-bced-bbb3e3a723e7" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.484518 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee99571-3f9a-49ba-bced-bbb3e3a723e7" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.484734 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="cee99571-3f9a-49ba-bced-bbb3e3a723e7" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.485502 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.489214 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.489214 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.489876 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.495082 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.587147 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9gr2\" (UniqueName: \"kubernetes.io/projected/fec122f7-2237-49c0-b5aa-4e251827b058-kube-api-access-d9gr2\") pod \"nova-cell1-novncproxy-0\" (UID: \"fec122f7-2237-49c0-b5aa-4e251827b058\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.587215 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fec122f7-2237-49c0-b5aa-4e251827b058-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fec122f7-2237-49c0-b5aa-4e251827b058\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.587276 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fec122f7-2237-49c0-b5aa-4e251827b058-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fec122f7-2237-49c0-b5aa-4e251827b058\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 15:03:29 crc 
kubenswrapper[4774]: I1003 15:03:29.587299 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fec122f7-2237-49c0-b5aa-4e251827b058-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fec122f7-2237-49c0-b5aa-4e251827b058\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.587477 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fec122f7-2237-49c0-b5aa-4e251827b058-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fec122f7-2237-49c0-b5aa-4e251827b058\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.689146 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9gr2\" (UniqueName: \"kubernetes.io/projected/fec122f7-2237-49c0-b5aa-4e251827b058-kube-api-access-d9gr2\") pod \"nova-cell1-novncproxy-0\" (UID: \"fec122f7-2237-49c0-b5aa-4e251827b058\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.689224 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fec122f7-2237-49c0-b5aa-4e251827b058-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fec122f7-2237-49c0-b5aa-4e251827b058\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.689273 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fec122f7-2237-49c0-b5aa-4e251827b058-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fec122f7-2237-49c0-b5aa-4e251827b058\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 
15:03:29.689293 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fec122f7-2237-49c0-b5aa-4e251827b058-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fec122f7-2237-49c0-b5aa-4e251827b058\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.689406 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fec122f7-2237-49c0-b5aa-4e251827b058-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fec122f7-2237-49c0-b5aa-4e251827b058\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.695446 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fec122f7-2237-49c0-b5aa-4e251827b058-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fec122f7-2237-49c0-b5aa-4e251827b058\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.695458 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fec122f7-2237-49c0-b5aa-4e251827b058-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fec122f7-2237-49c0-b5aa-4e251827b058\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.695806 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fec122f7-2237-49c0-b5aa-4e251827b058-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fec122f7-2237-49c0-b5aa-4e251827b058\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.696781 4774 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fec122f7-2237-49c0-b5aa-4e251827b058-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fec122f7-2237-49c0-b5aa-4e251827b058\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.711437 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9gr2\" (UniqueName: \"kubernetes.io/projected/fec122f7-2237-49c0-b5aa-4e251827b058-kube-api-access-d9gr2\") pod \"nova-cell1-novncproxy-0\" (UID: \"fec122f7-2237-49c0-b5aa-4e251827b058\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.777805 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.778112 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="81e37fd3-224d-475c-bb4a-dd840dc3dd48" containerName="sg-core" containerID="cri-o://36c6a230b9ae15d9fc5517f74a49cc53ec0cd157ee350e534d9c2cb4d877897a" gracePeriod=30 Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.778179 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="81e37fd3-224d-475c-bb4a-dd840dc3dd48" containerName="proxy-httpd" containerID="cri-o://94174702c920dca8ed048feb8642fad03dbe1e88627679d2fe53e3fc3c4dbfcc" gracePeriod=30 Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.778341 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="81e37fd3-224d-475c-bb4a-dd840dc3dd48" containerName="ceilometer-notification-agent" containerID="cri-o://f58232fe6ee207596593c6637dbb4652f5612335c866ae4ea26e371fd5113d95" gracePeriod=30 Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.778451 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="81e37fd3-224d-475c-bb4a-dd840dc3dd48" containerName="ceilometer-central-agent" containerID="cri-o://71f6629c95f011d81aefc8a8aef6119ca3a8f22ace59b87dab2c29ee380c993a" gracePeriod=30 Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.801808 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="81e37fd3-224d-475c-bb4a-dd840dc3dd48" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Oct 03 15:03:29 crc kubenswrapper[4774]: I1003 15:03:29.820402 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 15:03:30 crc kubenswrapper[4774]: I1003 15:03:30.060697 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 15:03:30 crc kubenswrapper[4774]: I1003 15:03:30.151390 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-wsntx" event={"ID":"12ecba83-5e7f-4bec-a930-a02540cbde61","Type":"ContainerStarted","Data":"fd8f499f5e75df870fb2fcc1e32669506cbbe81ca292f4dd32d6228b8cbe4998"} Oct 03 15:03:30 crc kubenswrapper[4774]: I1003 15:03:30.152645 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59cf4bdb65-wsntx" Oct 03 15:03:30 crc kubenswrapper[4774]: I1003 15:03:30.163815 4774 generic.go:334] "Generic (PLEG): container finished" podID="81e37fd3-224d-475c-bb4a-dd840dc3dd48" containerID="94174702c920dca8ed048feb8642fad03dbe1e88627679d2fe53e3fc3c4dbfcc" exitCode=0 Oct 03 15:03:30 crc kubenswrapper[4774]: I1003 15:03:30.163853 4774 generic.go:334] "Generic (PLEG): container finished" podID="81e37fd3-224d-475c-bb4a-dd840dc3dd48" containerID="36c6a230b9ae15d9fc5517f74a49cc53ec0cd157ee350e534d9c2cb4d877897a" exitCode=2 Oct 03 15:03:30 crc kubenswrapper[4774]: I1003 15:03:30.163857 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"81e37fd3-224d-475c-bb4a-dd840dc3dd48","Type":"ContainerDied","Data":"94174702c920dca8ed048feb8642fad03dbe1e88627679d2fe53e3fc3c4dbfcc"} Oct 03 15:03:30 crc kubenswrapper[4774]: I1003 15:03:30.164030 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81e37fd3-224d-475c-bb4a-dd840dc3dd48","Type":"ContainerDied","Data":"36c6a230b9ae15d9fc5517f74a49cc53ec0cd157ee350e534d9c2cb4d877897a"} Oct 03 15:03:30 crc kubenswrapper[4774]: I1003 15:03:30.164611 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="bf2fba80-784f-471a-a4ca-2437ab06c2cb" containerName="nova-api-log" containerID="cri-o://baa12dc79699561010471bd70a2484d5ff9f6cd6d74a289144312749d82b84e6" gracePeriod=30 Oct 03 15:03:30 crc kubenswrapper[4774]: I1003 15:03:30.164655 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="bf2fba80-784f-471a-a4ca-2437ab06c2cb" containerName="nova-api-api" containerID="cri-o://e9506a626cd3852670069df872c66874a93031ebc3852911f346f884e9800645" gracePeriod=30 Oct 03 15:03:30 crc kubenswrapper[4774]: I1003 15:03:30.198149 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59cf4bdb65-wsntx" podStartSLOduration=3.198128943 podStartE2EDuration="3.198128943s" podCreationTimestamp="2025-10-03 15:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:03:30.181786298 +0000 UTC m=+1232.770989750" watchObservedRunningTime="2025-10-03 15:03:30.198128943 +0000 UTC m=+1232.787332395" Oct 03 15:03:30 crc kubenswrapper[4774]: I1003 15:03:30.289170 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 15:03:31 crc kubenswrapper[4774]: I1003 15:03:31.173364 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fec122f7-2237-49c0-b5aa-4e251827b058","Type":"ContainerStarted","Data":"1f4919d87836adadb5f34a6d4eea2bfdcba57e1934f0957c4866cea6fb7acc07"} Oct 03 15:03:31 crc kubenswrapper[4774]: I1003 15:03:31.173738 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fec122f7-2237-49c0-b5aa-4e251827b058","Type":"ContainerStarted","Data":"3708cdbd73572bec6bee85e5371405bd35405ebf434865038b1400685723eaff"} Oct 03 15:03:31 crc kubenswrapper[4774]: I1003 15:03:31.183520 4774 generic.go:334] "Generic (PLEG): container finished" podID="81e37fd3-224d-475c-bb4a-dd840dc3dd48" containerID="71f6629c95f011d81aefc8a8aef6119ca3a8f22ace59b87dab2c29ee380c993a" exitCode=0 Oct 03 15:03:31 crc kubenswrapper[4774]: I1003 15:03:31.183657 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81e37fd3-224d-475c-bb4a-dd840dc3dd48","Type":"ContainerDied","Data":"71f6629c95f011d81aefc8a8aef6119ca3a8f22ace59b87dab2c29ee380c993a"} Oct 03 15:03:31 crc kubenswrapper[4774]: I1003 15:03:31.187569 4774 generic.go:334] "Generic (PLEG): container finished" podID="bf2fba80-784f-471a-a4ca-2437ab06c2cb" containerID="baa12dc79699561010471bd70a2484d5ff9f6cd6d74a289144312749d82b84e6" exitCode=143 Oct 03 15:03:31 crc kubenswrapper[4774]: I1003 15:03:31.187686 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bf2fba80-784f-471a-a4ca-2437ab06c2cb","Type":"ContainerDied","Data":"baa12dc79699561010471bd70a2484d5ff9f6cd6d74a289144312749d82b84e6"} Oct 03 15:03:31 crc kubenswrapper[4774]: I1003 15:03:31.310792 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cee99571-3f9a-49ba-bced-bbb3e3a723e7" path="/var/lib/kubelet/pods/cee99571-3f9a-49ba-bced-bbb3e3a723e7/volumes" Oct 03 15:03:33 crc kubenswrapper[4774]: I1003 15:03:33.768302 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 15:03:33 crc kubenswrapper[4774]: I1003 15:03:33.792545 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=4.792519572 podStartE2EDuration="4.792519572s" podCreationTimestamp="2025-10-03 15:03:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:03:31.195630877 +0000 UTC m=+1233.784834349" watchObservedRunningTime="2025-10-03 15:03:33.792519572 +0000 UTC m=+1236.381723024" Oct 03 15:03:33 crc kubenswrapper[4774]: I1003 15:03:33.870212 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf2fba80-784f-471a-a4ca-2437ab06c2cb-combined-ca-bundle\") pod \"bf2fba80-784f-471a-a4ca-2437ab06c2cb\" (UID: \"bf2fba80-784f-471a-a4ca-2437ab06c2cb\") " Oct 03 15:03:33 crc kubenswrapper[4774]: I1003 15:03:33.870357 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gcmk\" (UniqueName: \"kubernetes.io/projected/bf2fba80-784f-471a-a4ca-2437ab06c2cb-kube-api-access-7gcmk\") pod \"bf2fba80-784f-471a-a4ca-2437ab06c2cb\" (UID: \"bf2fba80-784f-471a-a4ca-2437ab06c2cb\") " Oct 03 15:03:33 crc kubenswrapper[4774]: I1003 15:03:33.870442 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf2fba80-784f-471a-a4ca-2437ab06c2cb-config-data\") pod \"bf2fba80-784f-471a-a4ca-2437ab06c2cb\" (UID: \"bf2fba80-784f-471a-a4ca-2437ab06c2cb\") " Oct 03 15:03:33 crc kubenswrapper[4774]: I1003 15:03:33.870460 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf2fba80-784f-471a-a4ca-2437ab06c2cb-logs\") pod \"bf2fba80-784f-471a-a4ca-2437ab06c2cb\" (UID: 
\"bf2fba80-784f-471a-a4ca-2437ab06c2cb\") " Oct 03 15:03:33 crc kubenswrapper[4774]: I1003 15:03:33.871446 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf2fba80-784f-471a-a4ca-2437ab06c2cb-logs" (OuterVolumeSpecName: "logs") pod "bf2fba80-784f-471a-a4ca-2437ab06c2cb" (UID: "bf2fba80-784f-471a-a4ca-2437ab06c2cb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:03:33 crc kubenswrapper[4774]: I1003 15:03:33.876429 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf2fba80-784f-471a-a4ca-2437ab06c2cb-kube-api-access-7gcmk" (OuterVolumeSpecName: "kube-api-access-7gcmk") pod "bf2fba80-784f-471a-a4ca-2437ab06c2cb" (UID: "bf2fba80-784f-471a-a4ca-2437ab06c2cb"). InnerVolumeSpecName "kube-api-access-7gcmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:03:33 crc kubenswrapper[4774]: I1003 15:03:33.906966 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf2fba80-784f-471a-a4ca-2437ab06c2cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf2fba80-784f-471a-a4ca-2437ab06c2cb" (UID: "bf2fba80-784f-471a-a4ca-2437ab06c2cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:33 crc kubenswrapper[4774]: I1003 15:03:33.910459 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf2fba80-784f-471a-a4ca-2437ab06c2cb-config-data" (OuterVolumeSpecName: "config-data") pod "bf2fba80-784f-471a-a4ca-2437ab06c2cb" (UID: "bf2fba80-784f-471a-a4ca-2437ab06c2cb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:33 crc kubenswrapper[4774]: I1003 15:03:33.973354 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf2fba80-784f-471a-a4ca-2437ab06c2cb-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:33 crc kubenswrapper[4774]: I1003 15:03:33.973410 4774 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf2fba80-784f-471a-a4ca-2437ab06c2cb-logs\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:33 crc kubenswrapper[4774]: I1003 15:03:33.973419 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf2fba80-784f-471a-a4ca-2437ab06c2cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:33 crc kubenswrapper[4774]: I1003 15:03:33.973430 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gcmk\" (UniqueName: \"kubernetes.io/projected/bf2fba80-784f-471a-a4ca-2437ab06c2cb-kube-api-access-7gcmk\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.212922 4774 generic.go:334] "Generic (PLEG): container finished" podID="bf2fba80-784f-471a-a4ca-2437ab06c2cb" containerID="e9506a626cd3852670069df872c66874a93031ebc3852911f346f884e9800645" exitCode=0 Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.212960 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bf2fba80-784f-471a-a4ca-2437ab06c2cb","Type":"ContainerDied","Data":"e9506a626cd3852670069df872c66874a93031ebc3852911f346f884e9800645"} Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.213158 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bf2fba80-784f-471a-a4ca-2437ab06c2cb","Type":"ContainerDied","Data":"e13dfc468812f37c077f8d9f4a1f0bc5528dcb81f7a617ebf93c0d6476c3783d"} Oct 03 15:03:34 crc kubenswrapper[4774]: 
I1003 15:03:34.213177 4774 scope.go:117] "RemoveContainer" containerID="e9506a626cd3852670069df872c66874a93031ebc3852911f346f884e9800645" Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.213009 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.256059 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.264696 4774 scope.go:117] "RemoveContainer" containerID="baa12dc79699561010471bd70a2484d5ff9f6cd6d74a289144312749d82b84e6" Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.283526 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.292550 4774 scope.go:117] "RemoveContainer" containerID="e9506a626cd3852670069df872c66874a93031ebc3852911f346f884e9800645" Oct 03 15:03:34 crc kubenswrapper[4774]: E1003 15:03:34.300038 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9506a626cd3852670069df872c66874a93031ebc3852911f346f884e9800645\": container with ID starting with e9506a626cd3852670069df872c66874a93031ebc3852911f346f884e9800645 not found: ID does not exist" containerID="e9506a626cd3852670069df872c66874a93031ebc3852911f346f884e9800645" Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.300086 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9506a626cd3852670069df872c66874a93031ebc3852911f346f884e9800645"} err="failed to get container status \"e9506a626cd3852670069df872c66874a93031ebc3852911f346f884e9800645\": rpc error: code = NotFound desc = could not find container \"e9506a626cd3852670069df872c66874a93031ebc3852911f346f884e9800645\": container with ID starting with e9506a626cd3852670069df872c66874a93031ebc3852911f346f884e9800645 not found: 
ID does not exist" Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.300111 4774 scope.go:117] "RemoveContainer" containerID="baa12dc79699561010471bd70a2484d5ff9f6cd6d74a289144312749d82b84e6" Oct 03 15:03:34 crc kubenswrapper[4774]: E1003 15:03:34.300471 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"baa12dc79699561010471bd70a2484d5ff9f6cd6d74a289144312749d82b84e6\": container with ID starting with baa12dc79699561010471bd70a2484d5ff9f6cd6d74a289144312749d82b84e6 not found: ID does not exist" containerID="baa12dc79699561010471bd70a2484d5ff9f6cd6d74a289144312749d82b84e6" Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.300513 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"baa12dc79699561010471bd70a2484d5ff9f6cd6d74a289144312749d82b84e6"} err="failed to get container status \"baa12dc79699561010471bd70a2484d5ff9f6cd6d74a289144312749d82b84e6\": rpc error: code = NotFound desc = could not find container \"baa12dc79699561010471bd70a2484d5ff9f6cd6d74a289144312749d82b84e6\": container with ID starting with baa12dc79699561010471bd70a2484d5ff9f6cd6d74a289144312749d82b84e6 not found: ID does not exist" Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.306265 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 15:03:34 crc kubenswrapper[4774]: E1003 15:03:34.306741 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf2fba80-784f-471a-a4ca-2437ab06c2cb" containerName="nova-api-api" Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.306764 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf2fba80-784f-471a-a4ca-2437ab06c2cb" containerName="nova-api-api" Oct 03 15:03:34 crc kubenswrapper[4774]: E1003 15:03:34.306782 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf2fba80-784f-471a-a4ca-2437ab06c2cb" containerName="nova-api-log" Oct 03 
15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.306789 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf2fba80-784f-471a-a4ca-2437ab06c2cb" containerName="nova-api-log" Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.307019 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf2fba80-784f-471a-a4ca-2437ab06c2cb" containerName="nova-api-log" Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.307042 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf2fba80-784f-471a-a4ca-2437ab06c2cb" containerName="nova-api-api" Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.312485 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.315506 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.315902 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.316129 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.324882 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.485171 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj5lp\" (UniqueName: \"kubernetes.io/projected/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-kube-api-access-qj5lp\") pod \"nova-api-0\" (UID: \"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3\") " pod="openstack/nova-api-0" Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.485233 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-config-data\") pod \"nova-api-0\" (UID: \"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3\") " pod="openstack/nova-api-0" Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.485314 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3\") " pod="openstack/nova-api-0" Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.485354 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-logs\") pod \"nova-api-0\" (UID: \"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3\") " pod="openstack/nova-api-0" Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.485384 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-public-tls-certs\") pod \"nova-api-0\" (UID: \"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3\") " pod="openstack/nova-api-0" Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.485424 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3\") " pod="openstack/nova-api-0" Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.586587 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3\") " 
pod="openstack/nova-api-0" Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.586651 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-logs\") pod \"nova-api-0\" (UID: \"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3\") " pod="openstack/nova-api-0" Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.586675 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-public-tls-certs\") pod \"nova-api-0\" (UID: \"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3\") " pod="openstack/nova-api-0" Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.586718 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3\") " pod="openstack/nova-api-0" Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.586762 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj5lp\" (UniqueName: \"kubernetes.io/projected/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-kube-api-access-qj5lp\") pod \"nova-api-0\" (UID: \"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3\") " pod="openstack/nova-api-0" Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.586793 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-config-data\") pod \"nova-api-0\" (UID: \"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3\") " pod="openstack/nova-api-0" Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.587108 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-logs\") pod \"nova-api-0\" (UID: \"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3\") " pod="openstack/nova-api-0" Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.591307 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3\") " pod="openstack/nova-api-0" Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.593246 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-config-data\") pod \"nova-api-0\" (UID: \"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3\") " pod="openstack/nova-api-0" Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.593793 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-public-tls-certs\") pod \"nova-api-0\" (UID: \"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3\") " pod="openstack/nova-api-0" Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.601341 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3\") " pod="openstack/nova-api-0" Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.603644 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj5lp\" (UniqueName: \"kubernetes.io/projected/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-kube-api-access-qj5lp\") pod \"nova-api-0\" (UID: \"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3\") " pod="openstack/nova-api-0" Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.659697 4774 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 15:03:34 crc kubenswrapper[4774]: I1003 15:03:34.821163 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 03 15:03:35 crc kubenswrapper[4774]: I1003 15:03:35.185618 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 15:03:35 crc kubenswrapper[4774]: I1003 15:03:35.234978 4774 generic.go:334] "Generic (PLEG): container finished" podID="81e37fd3-224d-475c-bb4a-dd840dc3dd48" containerID="f58232fe6ee207596593c6637dbb4652f5612335c866ae4ea26e371fd5113d95" exitCode=0 Oct 03 15:03:35 crc kubenswrapper[4774]: I1003 15:03:35.235052 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81e37fd3-224d-475c-bb4a-dd840dc3dd48","Type":"ContainerDied","Data":"f58232fe6ee207596593c6637dbb4652f5612335c866ae4ea26e371fd5113d95"} Oct 03 15:03:35 crc kubenswrapper[4774]: I1003 15:03:35.237562 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3","Type":"ContainerStarted","Data":"cff5412ab2d246358ab1207465ab50d562bbd6b530d62aebffef4069c1e2be30"} Oct 03 15:03:35 crc kubenswrapper[4774]: I1003 15:03:35.314516 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf2fba80-784f-471a-a4ca-2437ab06c2cb" path="/var/lib/kubelet/pods/bf2fba80-784f-471a-a4ca-2437ab06c2cb/volumes" Oct 03 15:03:35 crc kubenswrapper[4774]: I1003 15:03:35.319919 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 15:03:35 crc kubenswrapper[4774]: I1003 15:03:35.405055 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81e37fd3-224d-475c-bb4a-dd840dc3dd48-log-httpd\") pod \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\" (UID: \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\") " Oct 03 15:03:35 crc kubenswrapper[4774]: I1003 15:03:35.405206 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81e37fd3-224d-475c-bb4a-dd840dc3dd48-combined-ca-bundle\") pod \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\" (UID: \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\") " Oct 03 15:03:35 crc kubenswrapper[4774]: I1003 15:03:35.405334 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfq4f\" (UniqueName: \"kubernetes.io/projected/81e37fd3-224d-475c-bb4a-dd840dc3dd48-kube-api-access-sfq4f\") pod \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\" (UID: \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\") " Oct 03 15:03:35 crc kubenswrapper[4774]: I1003 15:03:35.405493 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/81e37fd3-224d-475c-bb4a-dd840dc3dd48-ceilometer-tls-certs\") pod \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\" (UID: \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\") " Oct 03 15:03:35 crc kubenswrapper[4774]: I1003 15:03:35.406200 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81e37fd3-224d-475c-bb4a-dd840dc3dd48-config-data\") pod \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\" (UID: \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\") " Oct 03 15:03:35 crc kubenswrapper[4774]: I1003 15:03:35.406533 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/81e37fd3-224d-475c-bb4a-dd840dc3dd48-run-httpd\") pod \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\" (UID: \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\") " Oct 03 15:03:35 crc kubenswrapper[4774]: I1003 15:03:35.405547 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81e37fd3-224d-475c-bb4a-dd840dc3dd48-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "81e37fd3-224d-475c-bb4a-dd840dc3dd48" (UID: "81e37fd3-224d-475c-bb4a-dd840dc3dd48"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:03:35 crc kubenswrapper[4774]: I1003 15:03:35.406803 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81e37fd3-224d-475c-bb4a-dd840dc3dd48-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "81e37fd3-224d-475c-bb4a-dd840dc3dd48" (UID: "81e37fd3-224d-475c-bb4a-dd840dc3dd48"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:03:35 crc kubenswrapper[4774]: I1003 15:03:35.407197 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81e37fd3-224d-475c-bb4a-dd840dc3dd48-scripts\") pod \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\" (UID: \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\") " Oct 03 15:03:35 crc kubenswrapper[4774]: I1003 15:03:35.407322 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81e37fd3-224d-475c-bb4a-dd840dc3dd48-sg-core-conf-yaml\") pod \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\" (UID: \"81e37fd3-224d-475c-bb4a-dd840dc3dd48\") " Oct 03 15:03:35 crc kubenswrapper[4774]: I1003 15:03:35.407992 4774 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81e37fd3-224d-475c-bb4a-dd840dc3dd48-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 
15:03:35 crc kubenswrapper[4774]: I1003 15:03:35.408141 4774 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81e37fd3-224d-475c-bb4a-dd840dc3dd48-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:35 crc kubenswrapper[4774]: I1003 15:03:35.409976 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81e37fd3-224d-475c-bb4a-dd840dc3dd48-kube-api-access-sfq4f" (OuterVolumeSpecName: "kube-api-access-sfq4f") pod "81e37fd3-224d-475c-bb4a-dd840dc3dd48" (UID: "81e37fd3-224d-475c-bb4a-dd840dc3dd48"). InnerVolumeSpecName "kube-api-access-sfq4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:03:35 crc kubenswrapper[4774]: I1003 15:03:35.410324 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81e37fd3-224d-475c-bb4a-dd840dc3dd48-scripts" (OuterVolumeSpecName: "scripts") pod "81e37fd3-224d-475c-bb4a-dd840dc3dd48" (UID: "81e37fd3-224d-475c-bb4a-dd840dc3dd48"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:35 crc kubenswrapper[4774]: I1003 15:03:35.434695 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81e37fd3-224d-475c-bb4a-dd840dc3dd48-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "81e37fd3-224d-475c-bb4a-dd840dc3dd48" (UID: "81e37fd3-224d-475c-bb4a-dd840dc3dd48"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:35 crc kubenswrapper[4774]: I1003 15:03:35.489794 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81e37fd3-224d-475c-bb4a-dd840dc3dd48-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "81e37fd3-224d-475c-bb4a-dd840dc3dd48" (UID: "81e37fd3-224d-475c-bb4a-dd840dc3dd48"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:35 crc kubenswrapper[4774]: I1003 15:03:35.501699 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81e37fd3-224d-475c-bb4a-dd840dc3dd48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81e37fd3-224d-475c-bb4a-dd840dc3dd48" (UID: "81e37fd3-224d-475c-bb4a-dd840dc3dd48"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:35 crc kubenswrapper[4774]: I1003 15:03:35.510216 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81e37fd3-224d-475c-bb4a-dd840dc3dd48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:35 crc kubenswrapper[4774]: I1003 15:03:35.510245 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfq4f\" (UniqueName: \"kubernetes.io/projected/81e37fd3-224d-475c-bb4a-dd840dc3dd48-kube-api-access-sfq4f\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:35 crc kubenswrapper[4774]: I1003 15:03:35.510260 4774 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/81e37fd3-224d-475c-bb4a-dd840dc3dd48-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:35 crc kubenswrapper[4774]: I1003 15:03:35.510272 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81e37fd3-224d-475c-bb4a-dd840dc3dd48-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:35 crc kubenswrapper[4774]: I1003 15:03:35.510284 4774 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81e37fd3-224d-475c-bb4a-dd840dc3dd48-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:35 crc kubenswrapper[4774]: I1003 15:03:35.520159 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/81e37fd3-224d-475c-bb4a-dd840dc3dd48-config-data" (OuterVolumeSpecName: "config-data") pod "81e37fd3-224d-475c-bb4a-dd840dc3dd48" (UID: "81e37fd3-224d-475c-bb4a-dd840dc3dd48"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:35 crc kubenswrapper[4774]: I1003 15:03:35.611444 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81e37fd3-224d-475c-bb4a-dd840dc3dd48-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.252034 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81e37fd3-224d-475c-bb4a-dd840dc3dd48","Type":"ContainerDied","Data":"cdfe3ed7bc6a828b89de06c8f8a4a652114f05dd1ea0dd7adbd31efbec7daba2"} Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.252346 4774 scope.go:117] "RemoveContainer" containerID="94174702c920dca8ed048feb8642fad03dbe1e88627679d2fe53e3fc3c4dbfcc" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.252184 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.254983 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3","Type":"ContainerStarted","Data":"724829851ce9fa2a2522a20c98c6aaf3118982c6c1b46e6e9281b1d1e0c2796d"} Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.255030 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3","Type":"ContainerStarted","Data":"be3d8f01f0076a9148e2c7d154372173bec4890f900015812f6c33170f6700ec"} Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.280744 4774 scope.go:117] "RemoveContainer" containerID="36c6a230b9ae15d9fc5517f74a49cc53ec0cd157ee350e534d9c2cb4d877897a" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.295234 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.295216334 podStartE2EDuration="2.295216334s" podCreationTimestamp="2025-10-03 15:03:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:03:36.273154707 +0000 UTC m=+1238.862358169" watchObservedRunningTime="2025-10-03 15:03:36.295216334 +0000 UTC m=+1238.884419786" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.301339 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.307850 4774 scope.go:117] "RemoveContainer" containerID="f58232fe6ee207596593c6637dbb4652f5612335c866ae4ea26e371fd5113d95" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.309795 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.329554 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] 
Oct 03 15:03:36 crc kubenswrapper[4774]: E1003 15:03:36.329940 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81e37fd3-224d-475c-bb4a-dd840dc3dd48" containerName="ceilometer-central-agent" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.329958 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="81e37fd3-224d-475c-bb4a-dd840dc3dd48" containerName="ceilometer-central-agent" Oct 03 15:03:36 crc kubenswrapper[4774]: E1003 15:03:36.329969 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81e37fd3-224d-475c-bb4a-dd840dc3dd48" containerName="proxy-httpd" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.329979 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="81e37fd3-224d-475c-bb4a-dd840dc3dd48" containerName="proxy-httpd" Oct 03 15:03:36 crc kubenswrapper[4774]: E1003 15:03:36.329995 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81e37fd3-224d-475c-bb4a-dd840dc3dd48" containerName="ceilometer-notification-agent" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.330001 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="81e37fd3-224d-475c-bb4a-dd840dc3dd48" containerName="ceilometer-notification-agent" Oct 03 15:03:36 crc kubenswrapper[4774]: E1003 15:03:36.330012 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81e37fd3-224d-475c-bb4a-dd840dc3dd48" containerName="sg-core" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.330017 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="81e37fd3-224d-475c-bb4a-dd840dc3dd48" containerName="sg-core" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.330175 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="81e37fd3-224d-475c-bb4a-dd840dc3dd48" containerName="proxy-httpd" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.330190 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="81e37fd3-224d-475c-bb4a-dd840dc3dd48" 
containerName="sg-core" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.330206 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="81e37fd3-224d-475c-bb4a-dd840dc3dd48" containerName="ceilometer-notification-agent" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.330220 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="81e37fd3-224d-475c-bb4a-dd840dc3dd48" containerName="ceilometer-central-agent" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.331924 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.333564 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.337002 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.337182 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.354968 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.355339 4774 scope.go:117] "RemoveContainer" containerID="71f6629c95f011d81aefc8a8aef6119ca3a8f22ace59b87dab2c29ee380c993a" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.425369 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2lnd\" (UniqueName: \"kubernetes.io/projected/cebee680-e845-4735-95f6-e97d844399a3-kube-api-access-n2lnd\") pod \"ceilometer-0\" (UID: \"cebee680-e845-4735-95f6-e97d844399a3\") " pod="openstack/ceilometer-0" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.425450 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cebee680-e845-4735-95f6-e97d844399a3-config-data\") pod \"ceilometer-0\" (UID: \"cebee680-e845-4735-95f6-e97d844399a3\") " pod="openstack/ceilometer-0" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.425487 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cebee680-e845-4735-95f6-e97d844399a3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cebee680-e845-4735-95f6-e97d844399a3\") " pod="openstack/ceilometer-0" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.425602 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cebee680-e845-4735-95f6-e97d844399a3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cebee680-e845-4735-95f6-e97d844399a3\") " pod="openstack/ceilometer-0" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.425755 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cebee680-e845-4735-95f6-e97d844399a3-log-httpd\") pod \"ceilometer-0\" (UID: \"cebee680-e845-4735-95f6-e97d844399a3\") " pod="openstack/ceilometer-0" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.425883 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cebee680-e845-4735-95f6-e97d844399a3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cebee680-e845-4735-95f6-e97d844399a3\") " pod="openstack/ceilometer-0" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.425938 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cebee680-e845-4735-95f6-e97d844399a3-run-httpd\") pod 
\"ceilometer-0\" (UID: \"cebee680-e845-4735-95f6-e97d844399a3\") " pod="openstack/ceilometer-0" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.425979 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cebee680-e845-4735-95f6-e97d844399a3-scripts\") pod \"ceilometer-0\" (UID: \"cebee680-e845-4735-95f6-e97d844399a3\") " pod="openstack/ceilometer-0" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.527986 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cebee680-e845-4735-95f6-e97d844399a3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cebee680-e845-4735-95f6-e97d844399a3\") " pod="openstack/ceilometer-0" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.528051 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cebee680-e845-4735-95f6-e97d844399a3-log-httpd\") pod \"ceilometer-0\" (UID: \"cebee680-e845-4735-95f6-e97d844399a3\") " pod="openstack/ceilometer-0" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.528132 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cebee680-e845-4735-95f6-e97d844399a3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cebee680-e845-4735-95f6-e97d844399a3\") " pod="openstack/ceilometer-0" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.528165 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cebee680-e845-4735-95f6-e97d844399a3-run-httpd\") pod \"ceilometer-0\" (UID: \"cebee680-e845-4735-95f6-e97d844399a3\") " pod="openstack/ceilometer-0" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.528191 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cebee680-e845-4735-95f6-e97d844399a3-scripts\") pod \"ceilometer-0\" (UID: \"cebee680-e845-4735-95f6-e97d844399a3\") " pod="openstack/ceilometer-0" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.528314 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2lnd\" (UniqueName: \"kubernetes.io/projected/cebee680-e845-4735-95f6-e97d844399a3-kube-api-access-n2lnd\") pod \"ceilometer-0\" (UID: \"cebee680-e845-4735-95f6-e97d844399a3\") " pod="openstack/ceilometer-0" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.528339 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cebee680-e845-4735-95f6-e97d844399a3-config-data\") pod \"ceilometer-0\" (UID: \"cebee680-e845-4735-95f6-e97d844399a3\") " pod="openstack/ceilometer-0" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.528384 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cebee680-e845-4735-95f6-e97d844399a3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cebee680-e845-4735-95f6-e97d844399a3\") " pod="openstack/ceilometer-0" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.528761 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cebee680-e845-4735-95f6-e97d844399a3-log-httpd\") pod \"ceilometer-0\" (UID: \"cebee680-e845-4735-95f6-e97d844399a3\") " pod="openstack/ceilometer-0" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.529492 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cebee680-e845-4735-95f6-e97d844399a3-run-httpd\") pod \"ceilometer-0\" (UID: \"cebee680-e845-4735-95f6-e97d844399a3\") " 
pod="openstack/ceilometer-0" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.540282 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cebee680-e845-4735-95f6-e97d844399a3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cebee680-e845-4735-95f6-e97d844399a3\") " pod="openstack/ceilometer-0" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.541062 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cebee680-e845-4735-95f6-e97d844399a3-scripts\") pod \"ceilometer-0\" (UID: \"cebee680-e845-4735-95f6-e97d844399a3\") " pod="openstack/ceilometer-0" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.541099 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cebee680-e845-4735-95f6-e97d844399a3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cebee680-e845-4735-95f6-e97d844399a3\") " pod="openstack/ceilometer-0" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.541527 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cebee680-e845-4735-95f6-e97d844399a3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cebee680-e845-4735-95f6-e97d844399a3\") " pod="openstack/ceilometer-0" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.542286 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cebee680-e845-4735-95f6-e97d844399a3-config-data\") pod \"ceilometer-0\" (UID: \"cebee680-e845-4735-95f6-e97d844399a3\") " pod="openstack/ceilometer-0" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.547800 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2lnd\" (UniqueName: 
\"kubernetes.io/projected/cebee680-e845-4735-95f6-e97d844399a3-kube-api-access-n2lnd\") pod \"ceilometer-0\" (UID: \"cebee680-e845-4735-95f6-e97d844399a3\") " pod="openstack/ceilometer-0" Oct 03 15:03:36 crc kubenswrapper[4774]: I1003 15:03:36.658827 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 15:03:37 crc kubenswrapper[4774]: I1003 15:03:37.143739 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 15:03:37 crc kubenswrapper[4774]: I1003 15:03:37.263903 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cebee680-e845-4735-95f6-e97d844399a3","Type":"ContainerStarted","Data":"028a388f4039eaae5f2a33925ad238cb59b5a78d9563308e7440a364e522760d"} Oct 03 15:03:37 crc kubenswrapper[4774]: I1003 15:03:37.313171 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81e37fd3-224d-475c-bb4a-dd840dc3dd48" path="/var/lib/kubelet/pods/81e37fd3-224d-475c-bb4a-dd840dc3dd48/volumes" Oct 03 15:03:37 crc kubenswrapper[4774]: I1003 15:03:37.866578 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59cf4bdb65-wsntx" Oct 03 15:03:37 crc kubenswrapper[4774]: I1003 15:03:37.935225 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-jshth"] Oct 03 15:03:37 crc kubenswrapper[4774]: I1003 15:03:37.935693 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-845d6d6f59-jshth" podUID="a4786927-81ff-4f7e-9c17-558e01bf47fe" containerName="dnsmasq-dns" containerID="cri-o://94938aa23d0960766d355e1b39181d660e7ad8202dcf755e1064d794eff42e3c" gracePeriod=10 Oct 03 15:03:38 crc kubenswrapper[4774]: I1003 15:03:38.297249 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"cebee680-e845-4735-95f6-e97d844399a3","Type":"ContainerStarted","Data":"3221bbfc568b31fa3979d275a027d93f582124b455570c41407165f12fba7d42"} Oct 03 15:03:38 crc kubenswrapper[4774]: I1003 15:03:38.301444 4774 generic.go:334] "Generic (PLEG): container finished" podID="a4786927-81ff-4f7e-9c17-558e01bf47fe" containerID="94938aa23d0960766d355e1b39181d660e7ad8202dcf755e1064d794eff42e3c" exitCode=0 Oct 03 15:03:38 crc kubenswrapper[4774]: I1003 15:03:38.321402 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-jshth" event={"ID":"a4786927-81ff-4f7e-9c17-558e01bf47fe","Type":"ContainerDied","Data":"94938aa23d0960766d355e1b39181d660e7ad8202dcf755e1064d794eff42e3c"} Oct 03 15:03:38 crc kubenswrapper[4774]: I1003 15:03:38.542325 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-jshth" Oct 03 15:03:38 crc kubenswrapper[4774]: I1003 15:03:38.699083 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4786927-81ff-4f7e-9c17-558e01bf47fe-dns-svc\") pod \"a4786927-81ff-4f7e-9c17-558e01bf47fe\" (UID: \"a4786927-81ff-4f7e-9c17-558e01bf47fe\") " Oct 03 15:03:38 crc kubenswrapper[4774]: I1003 15:03:38.699173 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4pk2\" (UniqueName: \"kubernetes.io/projected/a4786927-81ff-4f7e-9c17-558e01bf47fe-kube-api-access-m4pk2\") pod \"a4786927-81ff-4f7e-9c17-558e01bf47fe\" (UID: \"a4786927-81ff-4f7e-9c17-558e01bf47fe\") " Oct 03 15:03:38 crc kubenswrapper[4774]: I1003 15:03:38.699272 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4786927-81ff-4f7e-9c17-558e01bf47fe-dns-swift-storage-0\") pod \"a4786927-81ff-4f7e-9c17-558e01bf47fe\" (UID: \"a4786927-81ff-4f7e-9c17-558e01bf47fe\") " Oct 03 15:03:38 
crc kubenswrapper[4774]: I1003 15:03:38.699317 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4786927-81ff-4f7e-9c17-558e01bf47fe-config\") pod \"a4786927-81ff-4f7e-9c17-558e01bf47fe\" (UID: \"a4786927-81ff-4f7e-9c17-558e01bf47fe\") " Oct 03 15:03:38 crc kubenswrapper[4774]: I1003 15:03:38.699349 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4786927-81ff-4f7e-9c17-558e01bf47fe-ovsdbserver-nb\") pod \"a4786927-81ff-4f7e-9c17-558e01bf47fe\" (UID: \"a4786927-81ff-4f7e-9c17-558e01bf47fe\") " Oct 03 15:03:38 crc kubenswrapper[4774]: I1003 15:03:38.699410 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4786927-81ff-4f7e-9c17-558e01bf47fe-ovsdbserver-sb\") pod \"a4786927-81ff-4f7e-9c17-558e01bf47fe\" (UID: \"a4786927-81ff-4f7e-9c17-558e01bf47fe\") " Oct 03 15:03:38 crc kubenswrapper[4774]: I1003 15:03:38.704143 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4786927-81ff-4f7e-9c17-558e01bf47fe-kube-api-access-m4pk2" (OuterVolumeSpecName: "kube-api-access-m4pk2") pod "a4786927-81ff-4f7e-9c17-558e01bf47fe" (UID: "a4786927-81ff-4f7e-9c17-558e01bf47fe"). InnerVolumeSpecName "kube-api-access-m4pk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:03:38 crc kubenswrapper[4774]: I1003 15:03:38.749004 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4786927-81ff-4f7e-9c17-558e01bf47fe-config" (OuterVolumeSpecName: "config") pod "a4786927-81ff-4f7e-9c17-558e01bf47fe" (UID: "a4786927-81ff-4f7e-9c17-558e01bf47fe"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:03:38 crc kubenswrapper[4774]: I1003 15:03:38.752064 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4786927-81ff-4f7e-9c17-558e01bf47fe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a4786927-81ff-4f7e-9c17-558e01bf47fe" (UID: "a4786927-81ff-4f7e-9c17-558e01bf47fe"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:03:38 crc kubenswrapper[4774]: I1003 15:03:38.764689 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4786927-81ff-4f7e-9c17-558e01bf47fe-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a4786927-81ff-4f7e-9c17-558e01bf47fe" (UID: "a4786927-81ff-4f7e-9c17-558e01bf47fe"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:03:38 crc kubenswrapper[4774]: I1003 15:03:38.767215 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4786927-81ff-4f7e-9c17-558e01bf47fe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a4786927-81ff-4f7e-9c17-558e01bf47fe" (UID: "a4786927-81ff-4f7e-9c17-558e01bf47fe"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:03:38 crc kubenswrapper[4774]: I1003 15:03:38.768794 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4786927-81ff-4f7e-9c17-558e01bf47fe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a4786927-81ff-4f7e-9c17-558e01bf47fe" (UID: "a4786927-81ff-4f7e-9c17-558e01bf47fe"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:03:38 crc kubenswrapper[4774]: I1003 15:03:38.801903 4774 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4786927-81ff-4f7e-9c17-558e01bf47fe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:38 crc kubenswrapper[4774]: I1003 15:03:38.801939 4774 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4786927-81ff-4f7e-9c17-558e01bf47fe-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:38 crc kubenswrapper[4774]: I1003 15:03:38.801950 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4pk2\" (UniqueName: \"kubernetes.io/projected/a4786927-81ff-4f7e-9c17-558e01bf47fe-kube-api-access-m4pk2\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:38 crc kubenswrapper[4774]: I1003 15:03:38.801964 4774 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4786927-81ff-4f7e-9c17-558e01bf47fe-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:38 crc kubenswrapper[4774]: I1003 15:03:38.801973 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4786927-81ff-4f7e-9c17-558e01bf47fe-config\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:38 crc kubenswrapper[4774]: I1003 15:03:38.801982 4774 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4786927-81ff-4f7e-9c17-558e01bf47fe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:39 crc kubenswrapper[4774]: I1003 15:03:39.312323 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-jshth" Oct 03 15:03:39 crc kubenswrapper[4774]: I1003 15:03:39.312353 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-jshth" event={"ID":"a4786927-81ff-4f7e-9c17-558e01bf47fe","Type":"ContainerDied","Data":"d04ef0c70f16a01663a74572f22767932b5fa4a623cb9c7a57afa61380f02f23"} Oct 03 15:03:39 crc kubenswrapper[4774]: I1003 15:03:39.312730 4774 scope.go:117] "RemoveContainer" containerID="94938aa23d0960766d355e1b39181d660e7ad8202dcf755e1064d794eff42e3c" Oct 03 15:03:39 crc kubenswrapper[4774]: I1003 15:03:39.320582 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cebee680-e845-4735-95f6-e97d844399a3","Type":"ContainerStarted","Data":"f88e542123f835f3d49319172afdc38bff4cee4fbc95463b239909ee2b99ea51"} Oct 03 15:03:39 crc kubenswrapper[4774]: I1003 15:03:39.337570 4774 scope.go:117] "RemoveContainer" containerID="36893f18d2527b640238e2e667734c422f32680734c79fa1d94a6ba50ec5f8b5" Oct 03 15:03:39 crc kubenswrapper[4774]: I1003 15:03:39.378984 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-jshth"] Oct 03 15:03:39 crc kubenswrapper[4774]: I1003 15:03:39.389329 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-jshth"] Oct 03 15:03:39 crc kubenswrapper[4774]: I1003 15:03:39.822583 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 03 15:03:39 crc kubenswrapper[4774]: I1003 15:03:39.865251 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 03 15:03:40 crc kubenswrapper[4774]: I1003 15:03:40.330007 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"cebee680-e845-4735-95f6-e97d844399a3","Type":"ContainerStarted","Data":"b8a5e114d5bc8e01bb12978188390c554224c28a8020148d872ec0f278d723e3"} Oct 03 15:03:40 crc kubenswrapper[4774]: I1003 15:03:40.345405 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 03 15:03:40 crc kubenswrapper[4774]: I1003 15:03:40.495696 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-8qhs9"] Oct 03 15:03:40 crc kubenswrapper[4774]: E1003 15:03:40.499749 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4786927-81ff-4f7e-9c17-558e01bf47fe" containerName="dnsmasq-dns" Oct 03 15:03:40 crc kubenswrapper[4774]: I1003 15:03:40.499779 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4786927-81ff-4f7e-9c17-558e01bf47fe" containerName="dnsmasq-dns" Oct 03 15:03:40 crc kubenswrapper[4774]: E1003 15:03:40.499828 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4786927-81ff-4f7e-9c17-558e01bf47fe" containerName="init" Oct 03 15:03:40 crc kubenswrapper[4774]: I1003 15:03:40.499837 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4786927-81ff-4f7e-9c17-558e01bf47fe" containerName="init" Oct 03 15:03:40 crc kubenswrapper[4774]: I1003 15:03:40.500298 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4786927-81ff-4f7e-9c17-558e01bf47fe" containerName="dnsmasq-dns" Oct 03 15:03:40 crc kubenswrapper[4774]: I1003 15:03:40.501042 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8qhs9" Oct 03 15:03:40 crc kubenswrapper[4774]: I1003 15:03:40.505871 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 03 15:03:40 crc kubenswrapper[4774]: I1003 15:03:40.509526 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 03 15:03:40 crc kubenswrapper[4774]: I1003 15:03:40.517599 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-8qhs9"] Oct 03 15:03:40 crc kubenswrapper[4774]: I1003 15:03:40.641787 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5abc11a-7d66-4906-9629-0ce6a1dd5264-scripts\") pod \"nova-cell1-cell-mapping-8qhs9\" (UID: \"c5abc11a-7d66-4906-9629-0ce6a1dd5264\") " pod="openstack/nova-cell1-cell-mapping-8qhs9" Oct 03 15:03:40 crc kubenswrapper[4774]: I1003 15:03:40.641987 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5abc11a-7d66-4906-9629-0ce6a1dd5264-config-data\") pod \"nova-cell1-cell-mapping-8qhs9\" (UID: \"c5abc11a-7d66-4906-9629-0ce6a1dd5264\") " pod="openstack/nova-cell1-cell-mapping-8qhs9" Oct 03 15:03:40 crc kubenswrapper[4774]: I1003 15:03:40.642079 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5abc11a-7d66-4906-9629-0ce6a1dd5264-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-8qhs9\" (UID: \"c5abc11a-7d66-4906-9629-0ce6a1dd5264\") " pod="openstack/nova-cell1-cell-mapping-8qhs9" Oct 03 15:03:40 crc kubenswrapper[4774]: I1003 15:03:40.642183 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9mkh\" (UniqueName: 
\"kubernetes.io/projected/c5abc11a-7d66-4906-9629-0ce6a1dd5264-kube-api-access-k9mkh\") pod \"nova-cell1-cell-mapping-8qhs9\" (UID: \"c5abc11a-7d66-4906-9629-0ce6a1dd5264\") " pod="openstack/nova-cell1-cell-mapping-8qhs9" Oct 03 15:03:40 crc kubenswrapper[4774]: I1003 15:03:40.743643 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9mkh\" (UniqueName: \"kubernetes.io/projected/c5abc11a-7d66-4906-9629-0ce6a1dd5264-kube-api-access-k9mkh\") pod \"nova-cell1-cell-mapping-8qhs9\" (UID: \"c5abc11a-7d66-4906-9629-0ce6a1dd5264\") " pod="openstack/nova-cell1-cell-mapping-8qhs9" Oct 03 15:03:40 crc kubenswrapper[4774]: I1003 15:03:40.743701 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5abc11a-7d66-4906-9629-0ce6a1dd5264-scripts\") pod \"nova-cell1-cell-mapping-8qhs9\" (UID: \"c5abc11a-7d66-4906-9629-0ce6a1dd5264\") " pod="openstack/nova-cell1-cell-mapping-8qhs9" Oct 03 15:03:40 crc kubenswrapper[4774]: I1003 15:03:40.743754 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5abc11a-7d66-4906-9629-0ce6a1dd5264-config-data\") pod \"nova-cell1-cell-mapping-8qhs9\" (UID: \"c5abc11a-7d66-4906-9629-0ce6a1dd5264\") " pod="openstack/nova-cell1-cell-mapping-8qhs9" Oct 03 15:03:40 crc kubenswrapper[4774]: I1003 15:03:40.743788 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5abc11a-7d66-4906-9629-0ce6a1dd5264-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-8qhs9\" (UID: \"c5abc11a-7d66-4906-9629-0ce6a1dd5264\") " pod="openstack/nova-cell1-cell-mapping-8qhs9" Oct 03 15:03:40 crc kubenswrapper[4774]: I1003 15:03:40.750686 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c5abc11a-7d66-4906-9629-0ce6a1dd5264-scripts\") pod \"nova-cell1-cell-mapping-8qhs9\" (UID: \"c5abc11a-7d66-4906-9629-0ce6a1dd5264\") " pod="openstack/nova-cell1-cell-mapping-8qhs9" Oct 03 15:03:40 crc kubenswrapper[4774]: I1003 15:03:40.753941 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5abc11a-7d66-4906-9629-0ce6a1dd5264-config-data\") pod \"nova-cell1-cell-mapping-8qhs9\" (UID: \"c5abc11a-7d66-4906-9629-0ce6a1dd5264\") " pod="openstack/nova-cell1-cell-mapping-8qhs9" Oct 03 15:03:40 crc kubenswrapper[4774]: I1003 15:03:40.755603 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5abc11a-7d66-4906-9629-0ce6a1dd5264-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-8qhs9\" (UID: \"c5abc11a-7d66-4906-9629-0ce6a1dd5264\") " pod="openstack/nova-cell1-cell-mapping-8qhs9" Oct 03 15:03:40 crc kubenswrapper[4774]: I1003 15:03:40.762150 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9mkh\" (UniqueName: \"kubernetes.io/projected/c5abc11a-7d66-4906-9629-0ce6a1dd5264-kube-api-access-k9mkh\") pod \"nova-cell1-cell-mapping-8qhs9\" (UID: \"c5abc11a-7d66-4906-9629-0ce6a1dd5264\") " pod="openstack/nova-cell1-cell-mapping-8qhs9" Oct 03 15:03:40 crc kubenswrapper[4774]: I1003 15:03:40.834105 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8qhs9" Oct 03 15:03:41 crc kubenswrapper[4774]: I1003 15:03:41.282761 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-8qhs9"] Oct 03 15:03:41 crc kubenswrapper[4774]: I1003 15:03:41.315950 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4786927-81ff-4f7e-9c17-558e01bf47fe" path="/var/lib/kubelet/pods/a4786927-81ff-4f7e-9c17-558e01bf47fe/volumes" Oct 03 15:03:41 crc kubenswrapper[4774]: I1003 15:03:41.341224 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8qhs9" event={"ID":"c5abc11a-7d66-4906-9629-0ce6a1dd5264","Type":"ContainerStarted","Data":"51ae25ca0f18d2459eeba2887605ad6f97176b4e301094d646055d320ed61c79"} Oct 03 15:03:42 crc kubenswrapper[4774]: I1003 15:03:42.350735 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8qhs9" event={"ID":"c5abc11a-7d66-4906-9629-0ce6a1dd5264","Type":"ContainerStarted","Data":"da3d46aff0ef3b843d744709380108d10fec0a26379fd469ffab543b18ba3ba9"} Oct 03 15:03:42 crc kubenswrapper[4774]: I1003 15:03:42.353587 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cebee680-e845-4735-95f6-e97d844399a3","Type":"ContainerStarted","Data":"3b5a593f9ba72cc93c71162784134b2fdd282eb09069758dfa1cc6fe4fa879f0"} Oct 03 15:03:42 crc kubenswrapper[4774]: I1003 15:03:42.354700 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 15:03:42 crc kubenswrapper[4774]: I1003 15:03:42.371054 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-8qhs9" podStartSLOduration=2.371031587 podStartE2EDuration="2.371031587s" podCreationTimestamp="2025-10-03 15:03:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-03 15:03:42.369311344 +0000 UTC m=+1244.958514796" watchObservedRunningTime="2025-10-03 15:03:42.371031587 +0000 UTC m=+1244.960235059" Oct 03 15:03:42 crc kubenswrapper[4774]: I1003 15:03:42.401665 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.400899624 podStartE2EDuration="6.401645125s" podCreationTimestamp="2025-10-03 15:03:36 +0000 UTC" firstStartedPulling="2025-10-03 15:03:37.15244561 +0000 UTC m=+1239.741649062" lastFinishedPulling="2025-10-03 15:03:41.153191111 +0000 UTC m=+1243.742394563" observedRunningTime="2025-10-03 15:03:42.393563405 +0000 UTC m=+1244.982766857" watchObservedRunningTime="2025-10-03 15:03:42.401645125 +0000 UTC m=+1244.990848577" Oct 03 15:03:44 crc kubenswrapper[4774]: I1003 15:03:44.660216 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 15:03:44 crc kubenswrapper[4774]: I1003 15:03:44.660597 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 15:03:45 crc kubenswrapper[4774]: I1003 15:03:45.672633 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0d6f2892-c5c3-4552-b7e6-2cc10d92afb3" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.202:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 15:03:45 crc kubenswrapper[4774]: I1003 15:03:45.672686 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0d6f2892-c5c3-4552-b7e6-2cc10d92afb3" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.202:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 15:03:46 crc kubenswrapper[4774]: I1003 15:03:46.456911 4774 generic.go:334] "Generic (PLEG): container finished" podID="c5abc11a-7d66-4906-9629-0ce6a1dd5264" 
containerID="da3d46aff0ef3b843d744709380108d10fec0a26379fd469ffab543b18ba3ba9" exitCode=0 Oct 03 15:03:46 crc kubenswrapper[4774]: I1003 15:03:46.456975 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8qhs9" event={"ID":"c5abc11a-7d66-4906-9629-0ce6a1dd5264","Type":"ContainerDied","Data":"da3d46aff0ef3b843d744709380108d10fec0a26379fd469ffab543b18ba3ba9"} Oct 03 15:03:48 crc kubenswrapper[4774]: I1003 15:03:48.006201 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8qhs9" Oct 03 15:03:48 crc kubenswrapper[4774]: I1003 15:03:48.085754 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9mkh\" (UniqueName: \"kubernetes.io/projected/c5abc11a-7d66-4906-9629-0ce6a1dd5264-kube-api-access-k9mkh\") pod \"c5abc11a-7d66-4906-9629-0ce6a1dd5264\" (UID: \"c5abc11a-7d66-4906-9629-0ce6a1dd5264\") " Oct 03 15:03:48 crc kubenswrapper[4774]: I1003 15:03:48.086075 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5abc11a-7d66-4906-9629-0ce6a1dd5264-scripts\") pod \"c5abc11a-7d66-4906-9629-0ce6a1dd5264\" (UID: \"c5abc11a-7d66-4906-9629-0ce6a1dd5264\") " Oct 03 15:03:48 crc kubenswrapper[4774]: I1003 15:03:48.086314 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5abc11a-7d66-4906-9629-0ce6a1dd5264-combined-ca-bundle\") pod \"c5abc11a-7d66-4906-9629-0ce6a1dd5264\" (UID: \"c5abc11a-7d66-4906-9629-0ce6a1dd5264\") " Oct 03 15:03:48 crc kubenswrapper[4774]: I1003 15:03:48.086358 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5abc11a-7d66-4906-9629-0ce6a1dd5264-config-data\") pod \"c5abc11a-7d66-4906-9629-0ce6a1dd5264\" (UID: 
\"c5abc11a-7d66-4906-9629-0ce6a1dd5264\") " Oct 03 15:03:48 crc kubenswrapper[4774]: I1003 15:03:48.091715 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5abc11a-7d66-4906-9629-0ce6a1dd5264-kube-api-access-k9mkh" (OuterVolumeSpecName: "kube-api-access-k9mkh") pod "c5abc11a-7d66-4906-9629-0ce6a1dd5264" (UID: "c5abc11a-7d66-4906-9629-0ce6a1dd5264"). InnerVolumeSpecName "kube-api-access-k9mkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:03:48 crc kubenswrapper[4774]: I1003 15:03:48.092857 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5abc11a-7d66-4906-9629-0ce6a1dd5264-scripts" (OuterVolumeSpecName: "scripts") pod "c5abc11a-7d66-4906-9629-0ce6a1dd5264" (UID: "c5abc11a-7d66-4906-9629-0ce6a1dd5264"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:48 crc kubenswrapper[4774]: I1003 15:03:48.113856 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5abc11a-7d66-4906-9629-0ce6a1dd5264-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5abc11a-7d66-4906-9629-0ce6a1dd5264" (UID: "c5abc11a-7d66-4906-9629-0ce6a1dd5264"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:48 crc kubenswrapper[4774]: I1003 15:03:48.142724 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5abc11a-7d66-4906-9629-0ce6a1dd5264-config-data" (OuterVolumeSpecName: "config-data") pod "c5abc11a-7d66-4906-9629-0ce6a1dd5264" (UID: "c5abc11a-7d66-4906-9629-0ce6a1dd5264"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:48 crc kubenswrapper[4774]: I1003 15:03:48.188615 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9mkh\" (UniqueName: \"kubernetes.io/projected/c5abc11a-7d66-4906-9629-0ce6a1dd5264-kube-api-access-k9mkh\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:48 crc kubenswrapper[4774]: I1003 15:03:48.188655 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5abc11a-7d66-4906-9629-0ce6a1dd5264-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:48 crc kubenswrapper[4774]: I1003 15:03:48.188665 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5abc11a-7d66-4906-9629-0ce6a1dd5264-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:48 crc kubenswrapper[4774]: I1003 15:03:48.188674 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5abc11a-7d66-4906-9629-0ce6a1dd5264-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:48 crc kubenswrapper[4774]: I1003 15:03:48.484411 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8qhs9" event={"ID":"c5abc11a-7d66-4906-9629-0ce6a1dd5264","Type":"ContainerDied","Data":"51ae25ca0f18d2459eeba2887605ad6f97176b4e301094d646055d320ed61c79"} Oct 03 15:03:48 crc kubenswrapper[4774]: I1003 15:03:48.484466 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51ae25ca0f18d2459eeba2887605ad6f97176b4e301094d646055d320ed61c79" Oct 03 15:03:48 crc kubenswrapper[4774]: I1003 15:03:48.484468 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8qhs9" Oct 03 15:03:48 crc kubenswrapper[4774]: I1003 15:03:48.668086 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 15:03:48 crc kubenswrapper[4774]: I1003 15:03:48.668661 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0d6f2892-c5c3-4552-b7e6-2cc10d92afb3" containerName="nova-api-api" containerID="cri-o://724829851ce9fa2a2522a20c98c6aaf3118982c6c1b46e6e9281b1d1e0c2796d" gracePeriod=30 Oct 03 15:03:48 crc kubenswrapper[4774]: I1003 15:03:48.668876 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0d6f2892-c5c3-4552-b7e6-2cc10d92afb3" containerName="nova-api-log" containerID="cri-o://be3d8f01f0076a9148e2c7d154372173bec4890f900015812f6c33170f6700ec" gracePeriod=30 Oct 03 15:03:48 crc kubenswrapper[4774]: I1003 15:03:48.683439 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 15:03:48 crc kubenswrapper[4774]: I1003 15:03:48.683654 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="14f3b6b4-467a-4848-aed7-abeffc7767ad" containerName="nova-scheduler-scheduler" containerID="cri-o://1325cc23a9db1562b57aa0518519bba5423158658166edf999ebd9eee30d8fd2" gracePeriod=30 Oct 03 15:03:48 crc kubenswrapper[4774]: I1003 15:03:48.706914 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 15:03:48 crc kubenswrapper[4774]: I1003 15:03:48.707473 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2a062c05-477a-48c4-be23-ec0690970699" containerName="nova-metadata-log" containerID="cri-o://b600f1ca5179462b2efcc9c62bea9d524a7cf73dbf7053485dbde0c034625316" gracePeriod=30 Oct 03 15:03:48 crc kubenswrapper[4774]: I1003 15:03:48.707542 4774 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2a062c05-477a-48c4-be23-ec0690970699" containerName="nova-metadata-metadata" containerID="cri-o://47ae8ed37e143ddafa7ded2644ff870b4905267ab7b74f7dbdf29f3f7faf9026" gracePeriod=30 Oct 03 15:03:49 crc kubenswrapper[4774]: I1003 15:03:49.464528 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 15:03:49 crc kubenswrapper[4774]: I1003 15:03:49.500519 4774 generic.go:334] "Generic (PLEG): container finished" podID="0d6f2892-c5c3-4552-b7e6-2cc10d92afb3" containerID="be3d8f01f0076a9148e2c7d154372173bec4890f900015812f6c33170f6700ec" exitCode=143 Oct 03 15:03:49 crc kubenswrapper[4774]: I1003 15:03:49.500589 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3","Type":"ContainerDied","Data":"be3d8f01f0076a9148e2c7d154372173bec4890f900015812f6c33170f6700ec"} Oct 03 15:03:49 crc kubenswrapper[4774]: I1003 15:03:49.503269 4774 generic.go:334] "Generic (PLEG): container finished" podID="2a062c05-477a-48c4-be23-ec0690970699" containerID="b600f1ca5179462b2efcc9c62bea9d524a7cf73dbf7053485dbde0c034625316" exitCode=143 Oct 03 15:03:49 crc kubenswrapper[4774]: I1003 15:03:49.503323 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2a062c05-477a-48c4-be23-ec0690970699","Type":"ContainerDied","Data":"b600f1ca5179462b2efcc9c62bea9d524a7cf73dbf7053485dbde0c034625316"} Oct 03 15:03:49 crc kubenswrapper[4774]: I1003 15:03:49.505141 4774 generic.go:334] "Generic (PLEG): container finished" podID="14f3b6b4-467a-4848-aed7-abeffc7767ad" containerID="1325cc23a9db1562b57aa0518519bba5423158658166edf999ebd9eee30d8fd2" exitCode=0 Oct 03 15:03:49 crc kubenswrapper[4774]: I1003 15:03:49.505269 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"14f3b6b4-467a-4848-aed7-abeffc7767ad","Type":"ContainerDied","Data":"1325cc23a9db1562b57aa0518519bba5423158658166edf999ebd9eee30d8fd2"} Oct 03 15:03:49 crc kubenswrapper[4774]: I1003 15:03:49.505386 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"14f3b6b4-467a-4848-aed7-abeffc7767ad","Type":"ContainerDied","Data":"a9b9725ee56e0d8287edcf441e2a0a0c0e93d6c5c1064d4a7479c1c5d46b91fb"} Oct 03 15:03:49 crc kubenswrapper[4774]: I1003 15:03:49.505495 4774 scope.go:117] "RemoveContainer" containerID="1325cc23a9db1562b57aa0518519bba5423158658166edf999ebd9eee30d8fd2" Oct 03 15:03:49 crc kubenswrapper[4774]: I1003 15:03:49.505730 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 15:03:49 crc kubenswrapper[4774]: I1003 15:03:49.531674 4774 scope.go:117] "RemoveContainer" containerID="1325cc23a9db1562b57aa0518519bba5423158658166edf999ebd9eee30d8fd2" Oct 03 15:03:49 crc kubenswrapper[4774]: E1003 15:03:49.534573 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1325cc23a9db1562b57aa0518519bba5423158658166edf999ebd9eee30d8fd2\": container with ID starting with 1325cc23a9db1562b57aa0518519bba5423158658166edf999ebd9eee30d8fd2 not found: ID does not exist" containerID="1325cc23a9db1562b57aa0518519bba5423158658166edf999ebd9eee30d8fd2" Oct 03 15:03:49 crc kubenswrapper[4774]: I1003 15:03:49.534708 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1325cc23a9db1562b57aa0518519bba5423158658166edf999ebd9eee30d8fd2"} err="failed to get container status \"1325cc23a9db1562b57aa0518519bba5423158658166edf999ebd9eee30d8fd2\": rpc error: code = NotFound desc = could not find container \"1325cc23a9db1562b57aa0518519bba5423158658166edf999ebd9eee30d8fd2\": container with ID starting with 
1325cc23a9db1562b57aa0518519bba5423158658166edf999ebd9eee30d8fd2 not found: ID does not exist" Oct 03 15:03:49 crc kubenswrapper[4774]: I1003 15:03:49.624205 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14f3b6b4-467a-4848-aed7-abeffc7767ad-config-data\") pod \"14f3b6b4-467a-4848-aed7-abeffc7767ad\" (UID: \"14f3b6b4-467a-4848-aed7-abeffc7767ad\") " Oct 03 15:03:49 crc kubenswrapper[4774]: I1003 15:03:49.624657 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wh5p4\" (UniqueName: \"kubernetes.io/projected/14f3b6b4-467a-4848-aed7-abeffc7767ad-kube-api-access-wh5p4\") pod \"14f3b6b4-467a-4848-aed7-abeffc7767ad\" (UID: \"14f3b6b4-467a-4848-aed7-abeffc7767ad\") " Oct 03 15:03:49 crc kubenswrapper[4774]: I1003 15:03:49.624757 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14f3b6b4-467a-4848-aed7-abeffc7767ad-combined-ca-bundle\") pod \"14f3b6b4-467a-4848-aed7-abeffc7767ad\" (UID: \"14f3b6b4-467a-4848-aed7-abeffc7767ad\") " Oct 03 15:03:49 crc kubenswrapper[4774]: I1003 15:03:49.629626 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14f3b6b4-467a-4848-aed7-abeffc7767ad-kube-api-access-wh5p4" (OuterVolumeSpecName: "kube-api-access-wh5p4") pod "14f3b6b4-467a-4848-aed7-abeffc7767ad" (UID: "14f3b6b4-467a-4848-aed7-abeffc7767ad"). InnerVolumeSpecName "kube-api-access-wh5p4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:03:49 crc kubenswrapper[4774]: I1003 15:03:49.673974 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14f3b6b4-467a-4848-aed7-abeffc7767ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14f3b6b4-467a-4848-aed7-abeffc7767ad" (UID: "14f3b6b4-467a-4848-aed7-abeffc7767ad"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:49 crc kubenswrapper[4774]: I1003 15:03:49.677181 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14f3b6b4-467a-4848-aed7-abeffc7767ad-config-data" (OuterVolumeSpecName: "config-data") pod "14f3b6b4-467a-4848-aed7-abeffc7767ad" (UID: "14f3b6b4-467a-4848-aed7-abeffc7767ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:49 crc kubenswrapper[4774]: I1003 15:03:49.726739 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14f3b6b4-467a-4848-aed7-abeffc7767ad-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:49 crc kubenswrapper[4774]: I1003 15:03:49.726794 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wh5p4\" (UniqueName: \"kubernetes.io/projected/14f3b6b4-467a-4848-aed7-abeffc7767ad-kube-api-access-wh5p4\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:49 crc kubenswrapper[4774]: I1003 15:03:49.726805 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14f3b6b4-467a-4848-aed7-abeffc7767ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:49 crc kubenswrapper[4774]: I1003 15:03:49.853889 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 15:03:49 crc kubenswrapper[4774]: I1003 15:03:49.864884 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 15:03:49 crc kubenswrapper[4774]: I1003 15:03:49.872100 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 15:03:49 crc kubenswrapper[4774]: E1003 15:03:49.872498 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5abc11a-7d66-4906-9629-0ce6a1dd5264" containerName="nova-manage" Oct 03 
15:03:49 crc kubenswrapper[4774]: I1003 15:03:49.872517 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5abc11a-7d66-4906-9629-0ce6a1dd5264" containerName="nova-manage" Oct 03 15:03:49 crc kubenswrapper[4774]: E1003 15:03:49.872556 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14f3b6b4-467a-4848-aed7-abeffc7767ad" containerName="nova-scheduler-scheduler" Oct 03 15:03:49 crc kubenswrapper[4774]: I1003 15:03:49.872568 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="14f3b6b4-467a-4848-aed7-abeffc7767ad" containerName="nova-scheduler-scheduler" Oct 03 15:03:49 crc kubenswrapper[4774]: I1003 15:03:49.872769 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="14f3b6b4-467a-4848-aed7-abeffc7767ad" containerName="nova-scheduler-scheduler" Oct 03 15:03:49 crc kubenswrapper[4774]: I1003 15:03:49.872792 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5abc11a-7d66-4906-9629-0ce6a1dd5264" containerName="nova-manage" Oct 03 15:03:49 crc kubenswrapper[4774]: I1003 15:03:49.873391 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 15:03:49 crc kubenswrapper[4774]: I1003 15:03:49.879275 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 03 15:03:49 crc kubenswrapper[4774]: I1003 15:03:49.906133 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 15:03:50 crc kubenswrapper[4774]: I1003 15:03:50.031707 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e38e5521-609a-4612-ad67-512c8a477e77-config-data\") pod \"nova-scheduler-0\" (UID: \"e38e5521-609a-4612-ad67-512c8a477e77\") " pod="openstack/nova-scheduler-0" Oct 03 15:03:50 crc kubenswrapper[4774]: I1003 15:03:50.031803 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e38e5521-609a-4612-ad67-512c8a477e77-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e38e5521-609a-4612-ad67-512c8a477e77\") " pod="openstack/nova-scheduler-0" Oct 03 15:03:50 crc kubenswrapper[4774]: I1003 15:03:50.031835 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n879b\" (UniqueName: \"kubernetes.io/projected/e38e5521-609a-4612-ad67-512c8a477e77-kube-api-access-n879b\") pod \"nova-scheduler-0\" (UID: \"e38e5521-609a-4612-ad67-512c8a477e77\") " pod="openstack/nova-scheduler-0" Oct 03 15:03:50 crc kubenswrapper[4774]: I1003 15:03:50.133347 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e38e5521-609a-4612-ad67-512c8a477e77-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e38e5521-609a-4612-ad67-512c8a477e77\") " pod="openstack/nova-scheduler-0" Oct 03 15:03:50 crc kubenswrapper[4774]: I1003 15:03:50.133432 4774 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n879b\" (UniqueName: \"kubernetes.io/projected/e38e5521-609a-4612-ad67-512c8a477e77-kube-api-access-n879b\") pod \"nova-scheduler-0\" (UID: \"e38e5521-609a-4612-ad67-512c8a477e77\") " pod="openstack/nova-scheduler-0" Oct 03 15:03:50 crc kubenswrapper[4774]: I1003 15:03:50.133645 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e38e5521-609a-4612-ad67-512c8a477e77-config-data\") pod \"nova-scheduler-0\" (UID: \"e38e5521-609a-4612-ad67-512c8a477e77\") " pod="openstack/nova-scheduler-0" Oct 03 15:03:50 crc kubenswrapper[4774]: I1003 15:03:50.137427 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e38e5521-609a-4612-ad67-512c8a477e77-config-data\") pod \"nova-scheduler-0\" (UID: \"e38e5521-609a-4612-ad67-512c8a477e77\") " pod="openstack/nova-scheduler-0" Oct 03 15:03:50 crc kubenswrapper[4774]: I1003 15:03:50.138443 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e38e5521-609a-4612-ad67-512c8a477e77-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e38e5521-609a-4612-ad67-512c8a477e77\") " pod="openstack/nova-scheduler-0" Oct 03 15:03:50 crc kubenswrapper[4774]: I1003 15:03:50.152123 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n879b\" (UniqueName: \"kubernetes.io/projected/e38e5521-609a-4612-ad67-512c8a477e77-kube-api-access-n879b\") pod \"nova-scheduler-0\" (UID: \"e38e5521-609a-4612-ad67-512c8a477e77\") " pod="openstack/nova-scheduler-0" Oct 03 15:03:50 crc kubenswrapper[4774]: I1003 15:03:50.262931 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 15:03:50 crc kubenswrapper[4774]: I1003 15:03:50.702933 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 15:03:50 crc kubenswrapper[4774]: W1003 15:03:50.708450 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode38e5521_609a_4612_ad67_512c8a477e77.slice/crio-bfc5aa9d52fe6701a872dbbe1926970543fed128bb1b6c59e37a066d086bc228 WatchSource:0}: Error finding container bfc5aa9d52fe6701a872dbbe1926970543fed128bb1b6c59e37a066d086bc228: Status 404 returned error can't find the container with id bfc5aa9d52fe6701a872dbbe1926970543fed128bb1b6c59e37a066d086bc228 Oct 03 15:03:51 crc kubenswrapper[4774]: I1003 15:03:51.311526 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14f3b6b4-467a-4848-aed7-abeffc7767ad" path="/var/lib/kubelet/pods/14f3b6b4-467a-4848-aed7-abeffc7767ad/volumes" Oct 03 15:03:51 crc kubenswrapper[4774]: I1003 15:03:51.542400 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e38e5521-609a-4612-ad67-512c8a477e77","Type":"ContainerStarted","Data":"9e7c76ff8b100c394fee9cea6f1f5be78b5dc1785f058f28a296b4c60240f65c"} Oct 03 15:03:51 crc kubenswrapper[4774]: I1003 15:03:51.542464 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e38e5521-609a-4612-ad67-512c8a477e77","Type":"ContainerStarted","Data":"bfc5aa9d52fe6701a872dbbe1926970543fed128bb1b6c59e37a066d086bc228"} Oct 03 15:03:51 crc kubenswrapper[4774]: I1003 15:03:51.570266 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.570243145 podStartE2EDuration="2.570243145s" podCreationTimestamp="2025-10-03 15:03:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-03 15:03:51.563494187 +0000 UTC m=+1254.152697649" watchObservedRunningTime="2025-10-03 15:03:51.570243145 +0000 UTC m=+1254.159446607" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.431987 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.442326 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.488059 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-internal-tls-certs\") pod \"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3\" (UID: \"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3\") " Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.488308 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-combined-ca-bundle\") pod \"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3\" (UID: \"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3\") " Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.488396 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-config-data\") pod \"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3\" (UID: \"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3\") " Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.488464 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-public-tls-certs\") pod \"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3\" (UID: \"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3\") " Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.488505 4774 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj5lp\" (UniqueName: \"kubernetes.io/projected/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-kube-api-access-qj5lp\") pod \"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3\" (UID: \"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3\") " Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.488540 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-logs\") pod \"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3\" (UID: \"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3\") " Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.490785 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-logs" (OuterVolumeSpecName: "logs") pod "0d6f2892-c5c3-4552-b7e6-2cc10d92afb3" (UID: "0d6f2892-c5c3-4552-b7e6-2cc10d92afb3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.493975 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-kube-api-access-qj5lp" (OuterVolumeSpecName: "kube-api-access-qj5lp") pod "0d6f2892-c5c3-4552-b7e6-2cc10d92afb3" (UID: "0d6f2892-c5c3-4552-b7e6-2cc10d92afb3"). InnerVolumeSpecName "kube-api-access-qj5lp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.521214 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-config-data" (OuterVolumeSpecName: "config-data") pod "0d6f2892-c5c3-4552-b7e6-2cc10d92afb3" (UID: "0d6f2892-c5c3-4552-b7e6-2cc10d92afb3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.537837 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d6f2892-c5c3-4552-b7e6-2cc10d92afb3" (UID: "0d6f2892-c5c3-4552-b7e6-2cc10d92afb3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.558793 4774 generic.go:334] "Generic (PLEG): container finished" podID="2a062c05-477a-48c4-be23-ec0690970699" containerID="47ae8ed37e143ddafa7ded2644ff870b4905267ab7b74f7dbdf29f3f7faf9026" exitCode=0 Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.558865 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2a062c05-477a-48c4-be23-ec0690970699","Type":"ContainerDied","Data":"47ae8ed37e143ddafa7ded2644ff870b4905267ab7b74f7dbdf29f3f7faf9026"} Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.558897 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2a062c05-477a-48c4-be23-ec0690970699","Type":"ContainerDied","Data":"f307919fda045c366e28355b8dcf1587519c4775c86caaf112a4aa116ff59122"} Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.558917 4774 scope.go:117] "RemoveContainer" containerID="47ae8ed37e143ddafa7ded2644ff870b4905267ab7b74f7dbdf29f3f7faf9026" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.559062 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.563166 4774 generic.go:334] "Generic (PLEG): container finished" podID="0d6f2892-c5c3-4552-b7e6-2cc10d92afb3" containerID="724829851ce9fa2a2522a20c98c6aaf3118982c6c1b46e6e9281b1d1e0c2796d" exitCode=0 Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.563573 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.564060 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3","Type":"ContainerDied","Data":"724829851ce9fa2a2522a20c98c6aaf3118982c6c1b46e6e9281b1d1e0c2796d"} Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.564082 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0d6f2892-c5c3-4552-b7e6-2cc10d92afb3","Type":"ContainerDied","Data":"cff5412ab2d246358ab1207465ab50d562bbd6b530d62aebffef4069c1e2be30"} Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.583481 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0d6f2892-c5c3-4552-b7e6-2cc10d92afb3" (UID: "0d6f2892-c5c3-4552-b7e6-2cc10d92afb3"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.584540 4774 scope.go:117] "RemoveContainer" containerID="b600f1ca5179462b2efcc9c62bea9d524a7cf73dbf7053485dbde0c034625316" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.589579 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0d6f2892-c5c3-4552-b7e6-2cc10d92afb3" (UID: "0d6f2892-c5c3-4552-b7e6-2cc10d92afb3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.598861 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a062c05-477a-48c4-be23-ec0690970699-nova-metadata-tls-certs\") pod \"2a062c05-477a-48c4-be23-ec0690970699\" (UID: \"2a062c05-477a-48c4-be23-ec0690970699\") " Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.599313 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a062c05-477a-48c4-be23-ec0690970699-combined-ca-bundle\") pod \"2a062c05-477a-48c4-be23-ec0690970699\" (UID: \"2a062c05-477a-48c4-be23-ec0690970699\") " Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.599386 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a062c05-477a-48c4-be23-ec0690970699-logs\") pod \"2a062c05-477a-48c4-be23-ec0690970699\" (UID: \"2a062c05-477a-48c4-be23-ec0690970699\") " Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.599429 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a062c05-477a-48c4-be23-ec0690970699-config-data\") pod 
\"2a062c05-477a-48c4-be23-ec0690970699\" (UID: \"2a062c05-477a-48c4-be23-ec0690970699\") " Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.599518 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcsdm\" (UniqueName: \"kubernetes.io/projected/2a062c05-477a-48c4-be23-ec0690970699-kube-api-access-xcsdm\") pod \"2a062c05-477a-48c4-be23-ec0690970699\" (UID: \"2a062c05-477a-48c4-be23-ec0690970699\") " Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.600066 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.600083 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.600094 4774 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.600105 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qj5lp\" (UniqueName: \"kubernetes.io/projected/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-kube-api-access-qj5lp\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.600119 4774 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-logs\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.600130 4774 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.600254 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a062c05-477a-48c4-be23-ec0690970699-logs" (OuterVolumeSpecName: "logs") pod "2a062c05-477a-48c4-be23-ec0690970699" (UID: "2a062c05-477a-48c4-be23-ec0690970699"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.603085 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a062c05-477a-48c4-be23-ec0690970699-kube-api-access-xcsdm" (OuterVolumeSpecName: "kube-api-access-xcsdm") pod "2a062c05-477a-48c4-be23-ec0690970699" (UID: "2a062c05-477a-48c4-be23-ec0690970699"). InnerVolumeSpecName "kube-api-access-xcsdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.605463 4774 scope.go:117] "RemoveContainer" containerID="47ae8ed37e143ddafa7ded2644ff870b4905267ab7b74f7dbdf29f3f7faf9026" Oct 03 15:03:52 crc kubenswrapper[4774]: E1003 15:03:52.605983 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47ae8ed37e143ddafa7ded2644ff870b4905267ab7b74f7dbdf29f3f7faf9026\": container with ID starting with 47ae8ed37e143ddafa7ded2644ff870b4905267ab7b74f7dbdf29f3f7faf9026 not found: ID does not exist" containerID="47ae8ed37e143ddafa7ded2644ff870b4905267ab7b74f7dbdf29f3f7faf9026" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.606042 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47ae8ed37e143ddafa7ded2644ff870b4905267ab7b74f7dbdf29f3f7faf9026"} err="failed to get container status \"47ae8ed37e143ddafa7ded2644ff870b4905267ab7b74f7dbdf29f3f7faf9026\": rpc error: code = NotFound 
desc = could not find container \"47ae8ed37e143ddafa7ded2644ff870b4905267ab7b74f7dbdf29f3f7faf9026\": container with ID starting with 47ae8ed37e143ddafa7ded2644ff870b4905267ab7b74f7dbdf29f3f7faf9026 not found: ID does not exist" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.606079 4774 scope.go:117] "RemoveContainer" containerID="b600f1ca5179462b2efcc9c62bea9d524a7cf73dbf7053485dbde0c034625316" Oct 03 15:03:52 crc kubenswrapper[4774]: E1003 15:03:52.606434 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b600f1ca5179462b2efcc9c62bea9d524a7cf73dbf7053485dbde0c034625316\": container with ID starting with b600f1ca5179462b2efcc9c62bea9d524a7cf73dbf7053485dbde0c034625316 not found: ID does not exist" containerID="b600f1ca5179462b2efcc9c62bea9d524a7cf73dbf7053485dbde0c034625316" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.606495 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b600f1ca5179462b2efcc9c62bea9d524a7cf73dbf7053485dbde0c034625316"} err="failed to get container status \"b600f1ca5179462b2efcc9c62bea9d524a7cf73dbf7053485dbde0c034625316\": rpc error: code = NotFound desc = could not find container \"b600f1ca5179462b2efcc9c62bea9d524a7cf73dbf7053485dbde0c034625316\": container with ID starting with b600f1ca5179462b2efcc9c62bea9d524a7cf73dbf7053485dbde0c034625316 not found: ID does not exist" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.606524 4774 scope.go:117] "RemoveContainer" containerID="724829851ce9fa2a2522a20c98c6aaf3118982c6c1b46e6e9281b1d1e0c2796d" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.628053 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a062c05-477a-48c4-be23-ec0690970699-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a062c05-477a-48c4-be23-ec0690970699" (UID: "2a062c05-477a-48c4-be23-ec0690970699"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.631400 4774 scope.go:117] "RemoveContainer" containerID="be3d8f01f0076a9148e2c7d154372173bec4890f900015812f6c33170f6700ec" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.636553 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a062c05-477a-48c4-be23-ec0690970699-config-data" (OuterVolumeSpecName: "config-data") pod "2a062c05-477a-48c4-be23-ec0690970699" (UID: "2a062c05-477a-48c4-be23-ec0690970699"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.657787 4774 scope.go:117] "RemoveContainer" containerID="724829851ce9fa2a2522a20c98c6aaf3118982c6c1b46e6e9281b1d1e0c2796d" Oct 03 15:03:52 crc kubenswrapper[4774]: E1003 15:03:52.658223 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"724829851ce9fa2a2522a20c98c6aaf3118982c6c1b46e6e9281b1d1e0c2796d\": container with ID starting with 724829851ce9fa2a2522a20c98c6aaf3118982c6c1b46e6e9281b1d1e0c2796d not found: ID does not exist" containerID="724829851ce9fa2a2522a20c98c6aaf3118982c6c1b46e6e9281b1d1e0c2796d" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.658264 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"724829851ce9fa2a2522a20c98c6aaf3118982c6c1b46e6e9281b1d1e0c2796d"} err="failed to get container status \"724829851ce9fa2a2522a20c98c6aaf3118982c6c1b46e6e9281b1d1e0c2796d\": rpc error: code = NotFound desc = could not find container \"724829851ce9fa2a2522a20c98c6aaf3118982c6c1b46e6e9281b1d1e0c2796d\": container with ID starting with 724829851ce9fa2a2522a20c98c6aaf3118982c6c1b46e6e9281b1d1e0c2796d not found: ID does not exist" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 
15:03:52.658290 4774 scope.go:117] "RemoveContainer" containerID="be3d8f01f0076a9148e2c7d154372173bec4890f900015812f6c33170f6700ec" Oct 03 15:03:52 crc kubenswrapper[4774]: E1003 15:03:52.658609 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be3d8f01f0076a9148e2c7d154372173bec4890f900015812f6c33170f6700ec\": container with ID starting with be3d8f01f0076a9148e2c7d154372173bec4890f900015812f6c33170f6700ec not found: ID does not exist" containerID="be3d8f01f0076a9148e2c7d154372173bec4890f900015812f6c33170f6700ec" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.658627 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be3d8f01f0076a9148e2c7d154372173bec4890f900015812f6c33170f6700ec"} err="failed to get container status \"be3d8f01f0076a9148e2c7d154372173bec4890f900015812f6c33170f6700ec\": rpc error: code = NotFound desc = could not find container \"be3d8f01f0076a9148e2c7d154372173bec4890f900015812f6c33170f6700ec\": container with ID starting with be3d8f01f0076a9148e2c7d154372173bec4890f900015812f6c33170f6700ec not found: ID does not exist" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.663963 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a062c05-477a-48c4-be23-ec0690970699-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "2a062c05-477a-48c4-be23-ec0690970699" (UID: "2a062c05-477a-48c4-be23-ec0690970699"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.702094 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a062c05-477a-48c4-be23-ec0690970699-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.702133 4774 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a062c05-477a-48c4-be23-ec0690970699-logs\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.702142 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a062c05-477a-48c4-be23-ec0690970699-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.702151 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcsdm\" (UniqueName: \"kubernetes.io/projected/2a062c05-477a-48c4-be23-ec0690970699-kube-api-access-xcsdm\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.702161 4774 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a062c05-477a-48c4-be23-ec0690970699-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.910890 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.921979 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.979337 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 15:03:52 crc kubenswrapper[4774]: I1003 15:03:52.998897 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 03 
15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.011398 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 15:03:53 crc kubenswrapper[4774]: E1003 15:03:53.011815 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d6f2892-c5c3-4552-b7e6-2cc10d92afb3" containerName="nova-api-log" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.011835 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d6f2892-c5c3-4552-b7e6-2cc10d92afb3" containerName="nova-api-log" Oct 03 15:03:53 crc kubenswrapper[4774]: E1003 15:03:53.012645 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d6f2892-c5c3-4552-b7e6-2cc10d92afb3" containerName="nova-api-api" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.012654 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d6f2892-c5c3-4552-b7e6-2cc10d92afb3" containerName="nova-api-api" Oct 03 15:03:53 crc kubenswrapper[4774]: E1003 15:03:53.012678 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a062c05-477a-48c4-be23-ec0690970699" containerName="nova-metadata-metadata" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.012686 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a062c05-477a-48c4-be23-ec0690970699" containerName="nova-metadata-metadata" Oct 03 15:03:53 crc kubenswrapper[4774]: E1003 15:03:53.012704 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a062c05-477a-48c4-be23-ec0690970699" containerName="nova-metadata-log" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.012710 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a062c05-477a-48c4-be23-ec0690970699" containerName="nova-metadata-log" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.013272 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a062c05-477a-48c4-be23-ec0690970699" containerName="nova-metadata-log" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 
15:03:53.013291 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a062c05-477a-48c4-be23-ec0690970699" containerName="nova-metadata-metadata" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.013317 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d6f2892-c5c3-4552-b7e6-2cc10d92afb3" containerName="nova-api-log" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.013332 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d6f2892-c5c3-4552-b7e6-2cc10d92afb3" containerName="nova-api-api" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.014288 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.016732 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.017186 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.022504 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.024716 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.027560 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.027807 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.028972 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.035005 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.055185 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.108769 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bplw\" (UniqueName: \"kubernetes.io/projected/3208c93b-4d66-477f-8255-677d70a111a1-kube-api-access-6bplw\") pod \"nova-metadata-0\" (UID: \"3208c93b-4d66-477f-8255-677d70a111a1\") " pod="openstack/nova-metadata-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.108843 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3208c93b-4d66-477f-8255-677d70a111a1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3208c93b-4d66-477f-8255-677d70a111a1\") " pod="openstack/nova-metadata-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.108908 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/96557def-81a7-44a3-86d4-72e10daa7d68-public-tls-certs\") pod \"nova-api-0\" (UID: \"96557def-81a7-44a3-86d4-72e10daa7d68\") " 
pod="openstack/nova-api-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.108937 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3208c93b-4d66-477f-8255-677d70a111a1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3208c93b-4d66-477f-8255-677d70a111a1\") " pod="openstack/nova-metadata-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.109013 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96557def-81a7-44a3-86d4-72e10daa7d68-config-data\") pod \"nova-api-0\" (UID: \"96557def-81a7-44a3-86d4-72e10daa7d68\") " pod="openstack/nova-api-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.109043 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3208c93b-4d66-477f-8255-677d70a111a1-logs\") pod \"nova-metadata-0\" (UID: \"3208c93b-4d66-477f-8255-677d70a111a1\") " pod="openstack/nova-metadata-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.109063 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96557def-81a7-44a3-86d4-72e10daa7d68-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"96557def-81a7-44a3-86d4-72e10daa7d68\") " pod="openstack/nova-api-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.109216 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkmvt\" (UniqueName: \"kubernetes.io/projected/96557def-81a7-44a3-86d4-72e10daa7d68-kube-api-access-gkmvt\") pod \"nova-api-0\" (UID: \"96557def-81a7-44a3-86d4-72e10daa7d68\") " pod="openstack/nova-api-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.109253 4774 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3208c93b-4d66-477f-8255-677d70a111a1-config-data\") pod \"nova-metadata-0\" (UID: \"3208c93b-4d66-477f-8255-677d70a111a1\") " pod="openstack/nova-metadata-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.109352 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96557def-81a7-44a3-86d4-72e10daa7d68-logs\") pod \"nova-api-0\" (UID: \"96557def-81a7-44a3-86d4-72e10daa7d68\") " pod="openstack/nova-api-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.109519 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/96557def-81a7-44a3-86d4-72e10daa7d68-internal-tls-certs\") pod \"nova-api-0\" (UID: \"96557def-81a7-44a3-86d4-72e10daa7d68\") " pod="openstack/nova-api-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.211273 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96557def-81a7-44a3-86d4-72e10daa7d68-logs\") pod \"nova-api-0\" (UID: \"96557def-81a7-44a3-86d4-72e10daa7d68\") " pod="openstack/nova-api-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.211439 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/96557def-81a7-44a3-86d4-72e10daa7d68-internal-tls-certs\") pod \"nova-api-0\" (UID: \"96557def-81a7-44a3-86d4-72e10daa7d68\") " pod="openstack/nova-api-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.211505 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bplw\" (UniqueName: \"kubernetes.io/projected/3208c93b-4d66-477f-8255-677d70a111a1-kube-api-access-6bplw\") pod 
\"nova-metadata-0\" (UID: \"3208c93b-4d66-477f-8255-677d70a111a1\") " pod="openstack/nova-metadata-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.211539 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3208c93b-4d66-477f-8255-677d70a111a1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3208c93b-4d66-477f-8255-677d70a111a1\") " pod="openstack/nova-metadata-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.211581 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/96557def-81a7-44a3-86d4-72e10daa7d68-public-tls-certs\") pod \"nova-api-0\" (UID: \"96557def-81a7-44a3-86d4-72e10daa7d68\") " pod="openstack/nova-api-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.211617 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3208c93b-4d66-477f-8255-677d70a111a1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3208c93b-4d66-477f-8255-677d70a111a1\") " pod="openstack/nova-metadata-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.211672 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96557def-81a7-44a3-86d4-72e10daa7d68-config-data\") pod \"nova-api-0\" (UID: \"96557def-81a7-44a3-86d4-72e10daa7d68\") " pod="openstack/nova-api-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.211719 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3208c93b-4d66-477f-8255-677d70a111a1-logs\") pod \"nova-metadata-0\" (UID: \"3208c93b-4d66-477f-8255-677d70a111a1\") " pod="openstack/nova-metadata-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.211777 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96557def-81a7-44a3-86d4-72e10daa7d68-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"96557def-81a7-44a3-86d4-72e10daa7d68\") " pod="openstack/nova-api-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.212051 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3208c93b-4d66-477f-8255-677d70a111a1-config-data\") pod \"nova-metadata-0\" (UID: \"3208c93b-4d66-477f-8255-677d70a111a1\") " pod="openstack/nova-metadata-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.212083 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkmvt\" (UniqueName: \"kubernetes.io/projected/96557def-81a7-44a3-86d4-72e10daa7d68-kube-api-access-gkmvt\") pod \"nova-api-0\" (UID: \"96557def-81a7-44a3-86d4-72e10daa7d68\") " pod="openstack/nova-api-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.212128 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96557def-81a7-44a3-86d4-72e10daa7d68-logs\") pod \"nova-api-0\" (UID: \"96557def-81a7-44a3-86d4-72e10daa7d68\") " pod="openstack/nova-api-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.212533 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3208c93b-4d66-477f-8255-677d70a111a1-logs\") pod \"nova-metadata-0\" (UID: \"3208c93b-4d66-477f-8255-677d70a111a1\") " pod="openstack/nova-metadata-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.216964 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/96557def-81a7-44a3-86d4-72e10daa7d68-public-tls-certs\") pod \"nova-api-0\" (UID: \"96557def-81a7-44a3-86d4-72e10daa7d68\") " pod="openstack/nova-api-0" Oct 03 
15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.217267 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3208c93b-4d66-477f-8255-677d70a111a1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3208c93b-4d66-477f-8255-677d70a111a1\") " pod="openstack/nova-metadata-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.217496 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96557def-81a7-44a3-86d4-72e10daa7d68-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"96557def-81a7-44a3-86d4-72e10daa7d68\") " pod="openstack/nova-api-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.226796 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3208c93b-4d66-477f-8255-677d70a111a1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3208c93b-4d66-477f-8255-677d70a111a1\") " pod="openstack/nova-metadata-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.227397 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96557def-81a7-44a3-86d4-72e10daa7d68-config-data\") pod \"nova-api-0\" (UID: \"96557def-81a7-44a3-86d4-72e10daa7d68\") " pod="openstack/nova-api-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.227534 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/96557def-81a7-44a3-86d4-72e10daa7d68-internal-tls-certs\") pod \"nova-api-0\" (UID: \"96557def-81a7-44a3-86d4-72e10daa7d68\") " pod="openstack/nova-api-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.227909 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3208c93b-4d66-477f-8255-677d70a111a1-config-data\") 
pod \"nova-metadata-0\" (UID: \"3208c93b-4d66-477f-8255-677d70a111a1\") " pod="openstack/nova-metadata-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.229543 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkmvt\" (UniqueName: \"kubernetes.io/projected/96557def-81a7-44a3-86d4-72e10daa7d68-kube-api-access-gkmvt\") pod \"nova-api-0\" (UID: \"96557def-81a7-44a3-86d4-72e10daa7d68\") " pod="openstack/nova-api-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.229965 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bplw\" (UniqueName: \"kubernetes.io/projected/3208c93b-4d66-477f-8255-677d70a111a1-kube-api-access-6bplw\") pod \"nova-metadata-0\" (UID: \"3208c93b-4d66-477f-8255-677d70a111a1\") " pod="openstack/nova-metadata-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.334605 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d6f2892-c5c3-4552-b7e6-2cc10d92afb3" path="/var/lib/kubelet/pods/0d6f2892-c5c3-4552-b7e6-2cc10d92afb3/volumes" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.335440 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a062c05-477a-48c4-be23-ec0690970699" path="/var/lib/kubelet/pods/2a062c05-477a-48c4-be23-ec0690970699/volumes" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.347212 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.356801 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.819478 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 15:03:53 crc kubenswrapper[4774]: W1003 15:03:53.830509 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96557def_81a7_44a3_86d4_72e10daa7d68.slice/crio-05a09ad28350682b77e4c8e0b2f0fc4495158e43aafdcb41c6704169c9271087 WatchSource:0}: Error finding container 05a09ad28350682b77e4c8e0b2f0fc4495158e43aafdcb41c6704169c9271087: Status 404 returned error can't find the container with id 05a09ad28350682b77e4c8e0b2f0fc4495158e43aafdcb41c6704169c9271087 Oct 03 15:03:53 crc kubenswrapper[4774]: I1003 15:03:53.873177 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 15:03:53 crc kubenswrapper[4774]: W1003 15:03:53.880932 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3208c93b_4d66_477f_8255_677d70a111a1.slice/crio-dc38131c7da5b69d63d21751aaaf5e1305f1756e9feff49f08316eff2ff66cfa WatchSource:0}: Error finding container dc38131c7da5b69d63d21751aaaf5e1305f1756e9feff49f08316eff2ff66cfa: Status 404 returned error can't find the container with id dc38131c7da5b69d63d21751aaaf5e1305f1756e9feff49f08316eff2ff66cfa Oct 03 15:03:54 crc kubenswrapper[4774]: I1003 15:03:54.589717 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3208c93b-4d66-477f-8255-677d70a111a1","Type":"ContainerStarted","Data":"7f0a021df099a152214544783fd66aff2e0d00c724057c9ef46b661628770823"} Oct 03 15:03:54 crc kubenswrapper[4774]: I1003 15:03:54.590080 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"3208c93b-4d66-477f-8255-677d70a111a1","Type":"ContainerStarted","Data":"e756122247b72705d0af659cd16c1e686baed2e8fe345750161942dbb5d8237d"} Oct 03 15:03:54 crc kubenswrapper[4774]: I1003 15:03:54.590099 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3208c93b-4d66-477f-8255-677d70a111a1","Type":"ContainerStarted","Data":"dc38131c7da5b69d63d21751aaaf5e1305f1756e9feff49f08316eff2ff66cfa"} Oct 03 15:03:54 crc kubenswrapper[4774]: I1003 15:03:54.595501 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"96557def-81a7-44a3-86d4-72e10daa7d68","Type":"ContainerStarted","Data":"1fa858bcae0a5cc8e4e16aa28001244db89f531fafbac3ed6f9847c4e234998a"} Oct 03 15:03:54 crc kubenswrapper[4774]: I1003 15:03:54.595830 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"96557def-81a7-44a3-86d4-72e10daa7d68","Type":"ContainerStarted","Data":"c73b7cc5ba06276bf7c6180efda21b6e5feb4bbaea6359e3912bfe46b3cd7e72"} Oct 03 15:03:54 crc kubenswrapper[4774]: I1003 15:03:54.596047 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"96557def-81a7-44a3-86d4-72e10daa7d68","Type":"ContainerStarted","Data":"05a09ad28350682b77e4c8e0b2f0fc4495158e43aafdcb41c6704169c9271087"} Oct 03 15:03:54 crc kubenswrapper[4774]: I1003 15:03:54.613984 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.613967826 podStartE2EDuration="2.613967826s" podCreationTimestamp="2025-10-03 15:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:03:54.61170247 +0000 UTC m=+1257.200905922" watchObservedRunningTime="2025-10-03 15:03:54.613967826 +0000 UTC m=+1257.203171278" Oct 03 15:03:54 crc kubenswrapper[4774]: I1003 15:03:54.637735 4774 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.637720565 podStartE2EDuration="2.637720565s" podCreationTimestamp="2025-10-03 15:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:03:54.636163096 +0000 UTC m=+1257.225366548" watchObservedRunningTime="2025-10-03 15:03:54.637720565 +0000 UTC m=+1257.226924017" Oct 03 15:03:55 crc kubenswrapper[4774]: I1003 15:03:55.263366 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 03 15:03:58 crc kubenswrapper[4774]: I1003 15:03:58.348168 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 15:03:58 crc kubenswrapper[4774]: I1003 15:03:58.348725 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 15:04:00 crc kubenswrapper[4774]: I1003 15:04:00.263361 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 03 15:04:00 crc kubenswrapper[4774]: I1003 15:04:00.297278 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 03 15:04:00 crc kubenswrapper[4774]: I1003 15:04:00.700083 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 03 15:04:03 crc kubenswrapper[4774]: I1003 15:04:03.347509 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 15:04:03 crc kubenswrapper[4774]: I1003 15:04:03.347779 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 15:04:03 crc kubenswrapper[4774]: I1003 15:04:03.357747 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" 
Oct 03 15:04:03 crc kubenswrapper[4774]: I1003 15:04:03.357789 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 15:04:04 crc kubenswrapper[4774]: I1003 15:04:04.366627 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3208c93b-4d66-477f-8255-677d70a111a1" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 15:04:04 crc kubenswrapper[4774]: I1003 15:04:04.366609 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3208c93b-4d66-477f-8255-677d70a111a1" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 15:04:04 crc kubenswrapper[4774]: I1003 15:04:04.387540 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="96557def-81a7-44a3-86d4-72e10daa7d68" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.207:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 15:04:04 crc kubenswrapper[4774]: I1003 15:04:04.387562 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="96557def-81a7-44a3-86d4-72e10daa7d68" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.207:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 15:04:06 crc kubenswrapper[4774]: I1003 15:04:06.672021 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 03 15:04:13 crc kubenswrapper[4774]: I1003 15:04:13.355990 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 03 15:04:13 crc 
kubenswrapper[4774]: I1003 15:04:13.356690 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 03 15:04:13 crc kubenswrapper[4774]: I1003 15:04:13.363105 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 03 15:04:13 crc kubenswrapper[4774]: I1003 15:04:13.363669 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 15:04:13 crc kubenswrapper[4774]: I1003 15:04:13.364955 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 15:04:13 crc kubenswrapper[4774]: I1003 15:04:13.365496 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 15:04:13 crc kubenswrapper[4774]: I1003 15:04:13.368286 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 03 15:04:13 crc kubenswrapper[4774]: I1003 15:04:13.370438 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 15:04:13 crc kubenswrapper[4774]: I1003 15:04:13.829444 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 15:04:13 crc kubenswrapper[4774]: I1003 15:04:13.836031 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 15:04:21 crc kubenswrapper[4774]: I1003 15:04:21.490757 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 15:04:23 crc kubenswrapper[4774]: I1003 15:04:23.019887 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 15:04:25 crc kubenswrapper[4774]: I1003 15:04:25.543652 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="7fa97e79-a30c-4722-b02b-ec5494bd057c" 
containerName="rabbitmq" containerID="cri-o://b9c1f78fe703fc2ec1c54dca1c5970e4e1d1cd786f37bb24fddfa058335a6a97" gracePeriod=604796 Oct 03 15:04:27 crc kubenswrapper[4774]: I1003 15:04:27.458772 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="0a0a516a-bd97-4484-802b-71eb14f3ca3f" containerName="rabbitmq" containerID="cri-o://4f460fb22aff9e6a313dab120e473cd831b1efea3d93c5bf04659fe9ff70d964" gracePeriod=604796 Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.036217 4774 generic.go:334] "Generic (PLEG): container finished" podID="7fa97e79-a30c-4722-b02b-ec5494bd057c" containerID="b9c1f78fe703fc2ec1c54dca1c5970e4e1d1cd786f37bb24fddfa058335a6a97" exitCode=0 Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.036574 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7fa97e79-a30c-4722-b02b-ec5494bd057c","Type":"ContainerDied","Data":"b9c1f78fe703fc2ec1c54dca1c5970e4e1d1cd786f37bb24fddfa058335a6a97"} Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.176502 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.355405 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"7fa97e79-a30c-4722-b02b-ec5494bd057c\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.355502 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7fa97e79-a30c-4722-b02b-ec5494bd057c-pod-info\") pod \"7fa97e79-a30c-4722-b02b-ec5494bd057c\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.355529 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7fa97e79-a30c-4722-b02b-ec5494bd057c-config-data\") pod \"7fa97e79-a30c-4722-b02b-ec5494bd057c\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.355557 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7fa97e79-a30c-4722-b02b-ec5494bd057c-rabbitmq-tls\") pod \"7fa97e79-a30c-4722-b02b-ec5494bd057c\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.355637 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7fa97e79-a30c-4722-b02b-ec5494bd057c-rabbitmq-erlang-cookie\") pod \"7fa97e79-a30c-4722-b02b-ec5494bd057c\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.355712 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/7fa97e79-a30c-4722-b02b-ec5494bd057c-erlang-cookie-secret\") pod \"7fa97e79-a30c-4722-b02b-ec5494bd057c\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.355856 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7fa97e79-a30c-4722-b02b-ec5494bd057c-rabbitmq-plugins\") pod \"7fa97e79-a30c-4722-b02b-ec5494bd057c\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.355880 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5r22\" (UniqueName: \"kubernetes.io/projected/7fa97e79-a30c-4722-b02b-ec5494bd057c-kube-api-access-b5r22\") pod \"7fa97e79-a30c-4722-b02b-ec5494bd057c\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.355914 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7fa97e79-a30c-4722-b02b-ec5494bd057c-server-conf\") pod \"7fa97e79-a30c-4722-b02b-ec5494bd057c\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.355949 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7fa97e79-a30c-4722-b02b-ec5494bd057c-plugins-conf\") pod \"7fa97e79-a30c-4722-b02b-ec5494bd057c\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.355974 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7fa97e79-a30c-4722-b02b-ec5494bd057c-rabbitmq-confd\") pod \"7fa97e79-a30c-4722-b02b-ec5494bd057c\" (UID: \"7fa97e79-a30c-4722-b02b-ec5494bd057c\") " Oct 03 15:04:32 crc kubenswrapper[4774]: 
I1003 15:04:32.356329 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fa97e79-a30c-4722-b02b-ec5494bd057c-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "7fa97e79-a30c-4722-b02b-ec5494bd057c" (UID: "7fa97e79-a30c-4722-b02b-ec5494bd057c"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.356430 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fa97e79-a30c-4722-b02b-ec5494bd057c-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "7fa97e79-a30c-4722-b02b-ec5494bd057c" (UID: "7fa97e79-a30c-4722-b02b-ec5494bd057c"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.356591 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fa97e79-a30c-4722-b02b-ec5494bd057c-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "7fa97e79-a30c-4722-b02b-ec5494bd057c" (UID: "7fa97e79-a30c-4722-b02b-ec5494bd057c"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.356810 4774 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7fa97e79-a30c-4722-b02b-ec5494bd057c-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.356842 4774 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7fa97e79-a30c-4722-b02b-ec5494bd057c-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.356858 4774 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7fa97e79-a30c-4722-b02b-ec5494bd057c-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.362228 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fa97e79-a30c-4722-b02b-ec5494bd057c-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "7fa97e79-a30c-4722-b02b-ec5494bd057c" (UID: "7fa97e79-a30c-4722-b02b-ec5494bd057c"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.362239 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "7fa97e79-a30c-4722-b02b-ec5494bd057c" (UID: "7fa97e79-a30c-4722-b02b-ec5494bd057c"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.364490 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/7fa97e79-a30c-4722-b02b-ec5494bd057c-pod-info" (OuterVolumeSpecName: "pod-info") pod "7fa97e79-a30c-4722-b02b-ec5494bd057c" (UID: "7fa97e79-a30c-4722-b02b-ec5494bd057c"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.366753 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fa97e79-a30c-4722-b02b-ec5494bd057c-kube-api-access-b5r22" (OuterVolumeSpecName: "kube-api-access-b5r22") pod "7fa97e79-a30c-4722-b02b-ec5494bd057c" (UID: "7fa97e79-a30c-4722-b02b-ec5494bd057c"). InnerVolumeSpecName "kube-api-access-b5r22". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.371522 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fa97e79-a30c-4722-b02b-ec5494bd057c-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "7fa97e79-a30c-4722-b02b-ec5494bd057c" (UID: "7fa97e79-a30c-4722-b02b-ec5494bd057c"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.402253 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fa97e79-a30c-4722-b02b-ec5494bd057c-config-data" (OuterVolumeSpecName: "config-data") pod "7fa97e79-a30c-4722-b02b-ec5494bd057c" (UID: "7fa97e79-a30c-4722-b02b-ec5494bd057c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.424999 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fa97e79-a30c-4722-b02b-ec5494bd057c-server-conf" (OuterVolumeSpecName: "server-conf") pod "7fa97e79-a30c-4722-b02b-ec5494bd057c" (UID: "7fa97e79-a30c-4722-b02b-ec5494bd057c"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.459237 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5r22\" (UniqueName: \"kubernetes.io/projected/7fa97e79-a30c-4722-b02b-ec5494bd057c-kube-api-access-b5r22\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.459985 4774 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7fa97e79-a30c-4722-b02b-ec5494bd057c-server-conf\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.460035 4774 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.460182 4774 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7fa97e79-a30c-4722-b02b-ec5494bd057c-pod-info\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.460197 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7fa97e79-a30c-4722-b02b-ec5494bd057c-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.460211 4774 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/7fa97e79-a30c-4722-b02b-ec5494bd057c-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.460223 4774 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7fa97e79-a30c-4722-b02b-ec5494bd057c-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.484461 4774 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.495484 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fa97e79-a30c-4722-b02b-ec5494bd057c-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "7fa97e79-a30c-4722-b02b-ec5494bd057c" (UID: "7fa97e79-a30c-4722-b02b-ec5494bd057c"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.561834 4774 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7fa97e79-a30c-4722-b02b-ec5494bd057c-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:32 crc kubenswrapper[4774]: I1003 15:04:32.562189 4774 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.048628 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7fa97e79-a30c-4722-b02b-ec5494bd057c","Type":"ContainerDied","Data":"fb486f9b13d9d4abe260a357aa01f2ec006746e3d1b973422ea45d6803516e4a"} Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.048672 4774 scope.go:117] "RemoveContainer" containerID="b9c1f78fe703fc2ec1c54dca1c5970e4e1d1cd786f37bb24fddfa058335a6a97" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.048802 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.079020 4774 scope.go:117] "RemoveContainer" containerID="ba4f05d232c8d413350b06f66392f7b2e2403d6d44ba76999902b2b487809df3" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.090842 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.103955 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.136432 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 15:04:33 crc kubenswrapper[4774]: E1003 15:04:33.136864 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fa97e79-a30c-4722-b02b-ec5494bd057c" containerName="setup-container" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.136882 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fa97e79-a30c-4722-b02b-ec5494bd057c" containerName="setup-container" Oct 03 15:04:33 crc kubenswrapper[4774]: E1003 15:04:33.136907 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fa97e79-a30c-4722-b02b-ec5494bd057c" containerName="rabbitmq" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.136914 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fa97e79-a30c-4722-b02b-ec5494bd057c" containerName="rabbitmq" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.141408 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fa97e79-a30c-4722-b02b-ec5494bd057c" containerName="rabbitmq" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.142540 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.148241 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.148530 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.148650 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.148740 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.148913 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-m6dkz" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.149459 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.149568 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.161599 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.273085 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6d6cfa86-4356-4d79-9edd-977355592186-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6d6cfa86-4356-4d79-9edd-977355592186\") " pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.273130 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/6d6cfa86-4356-4d79-9edd-977355592186-config-data\") pod \"rabbitmq-server-0\" (UID: \"6d6cfa86-4356-4d79-9edd-977355592186\") " pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.273151 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6d6cfa86-4356-4d79-9edd-977355592186-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6d6cfa86-4356-4d79-9edd-977355592186\") " pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.273174 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6d6cfa86-4356-4d79-9edd-977355592186-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6d6cfa86-4356-4d79-9edd-977355592186\") " pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.273204 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6d6cfa86-4356-4d79-9edd-977355592186-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6d6cfa86-4356-4d79-9edd-977355592186\") " pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.273227 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6d6cfa86-4356-4d79-9edd-977355592186-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6d6cfa86-4356-4d79-9edd-977355592186\") " pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.273345 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod 
\"rabbitmq-server-0\" (UID: \"6d6cfa86-4356-4d79-9edd-977355592186\") " pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.273455 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6d6cfa86-4356-4d79-9edd-977355592186-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6d6cfa86-4356-4d79-9edd-977355592186\") " pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.273488 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdq8v\" (UniqueName: \"kubernetes.io/projected/6d6cfa86-4356-4d79-9edd-977355592186-kube-api-access-tdq8v\") pod \"rabbitmq-server-0\" (UID: \"6d6cfa86-4356-4d79-9edd-977355592186\") " pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.273544 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6d6cfa86-4356-4d79-9edd-977355592186-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6d6cfa86-4356-4d79-9edd-977355592186\") " pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.273607 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6d6cfa86-4356-4d79-9edd-977355592186-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6d6cfa86-4356-4d79-9edd-977355592186\") " pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.332407 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fa97e79-a30c-4722-b02b-ec5494bd057c" path="/var/lib/kubelet/pods/7fa97e79-a30c-4722-b02b-ec5494bd057c/volumes" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.374858 4774 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6d6cfa86-4356-4d79-9edd-977355592186-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6d6cfa86-4356-4d79-9edd-977355592186\") " pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.374942 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6d6cfa86-4356-4d79-9edd-977355592186-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6d6cfa86-4356-4d79-9edd-977355592186\") " pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.374975 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"6d6cfa86-4356-4d79-9edd-977355592186\") " pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.375002 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6d6cfa86-4356-4d79-9edd-977355592186-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6d6cfa86-4356-4d79-9edd-977355592186\") " pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.375018 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdq8v\" (UniqueName: \"kubernetes.io/projected/6d6cfa86-4356-4d79-9edd-977355592186-kube-api-access-tdq8v\") pod \"rabbitmq-server-0\" (UID: \"6d6cfa86-4356-4d79-9edd-977355592186\") " pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.375042 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/6d6cfa86-4356-4d79-9edd-977355592186-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6d6cfa86-4356-4d79-9edd-977355592186\") " pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.375064 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6d6cfa86-4356-4d79-9edd-977355592186-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6d6cfa86-4356-4d79-9edd-977355592186\") " pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.375139 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6d6cfa86-4356-4d79-9edd-977355592186-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6d6cfa86-4356-4d79-9edd-977355592186\") " pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.375156 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6d6cfa86-4356-4d79-9edd-977355592186-config-data\") pod \"rabbitmq-server-0\" (UID: \"6d6cfa86-4356-4d79-9edd-977355592186\") " pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.375173 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6d6cfa86-4356-4d79-9edd-977355592186-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6d6cfa86-4356-4d79-9edd-977355592186\") " pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.375193 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6d6cfa86-4356-4d79-9edd-977355592186-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: 
\"6d6cfa86-4356-4d79-9edd-977355592186\") " pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.376174 4774 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"6d6cfa86-4356-4d79-9edd-977355592186\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.377030 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6d6cfa86-4356-4d79-9edd-977355592186-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6d6cfa86-4356-4d79-9edd-977355592186\") " pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.377059 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6d6cfa86-4356-4d79-9edd-977355592186-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6d6cfa86-4356-4d79-9edd-977355592186\") " pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.377287 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6d6cfa86-4356-4d79-9edd-977355592186-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6d6cfa86-4356-4d79-9edd-977355592186\") " pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.377851 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6d6cfa86-4356-4d79-9edd-977355592186-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6d6cfa86-4356-4d79-9edd-977355592186\") " pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.377952 4774 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6d6cfa86-4356-4d79-9edd-977355592186-config-data\") pod \"rabbitmq-server-0\" (UID: \"6d6cfa86-4356-4d79-9edd-977355592186\") " pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.380835 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6d6cfa86-4356-4d79-9edd-977355592186-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6d6cfa86-4356-4d79-9edd-977355592186\") " pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.381482 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6d6cfa86-4356-4d79-9edd-977355592186-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6d6cfa86-4356-4d79-9edd-977355592186\") " pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.388113 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6d6cfa86-4356-4d79-9edd-977355592186-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6d6cfa86-4356-4d79-9edd-977355592186\") " pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.388529 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6d6cfa86-4356-4d79-9edd-977355592186-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6d6cfa86-4356-4d79-9edd-977355592186\") " pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.398496 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdq8v\" (UniqueName: \"kubernetes.io/projected/6d6cfa86-4356-4d79-9edd-977355592186-kube-api-access-tdq8v\") pod 
\"rabbitmq-server-0\" (UID: \"6d6cfa86-4356-4d79-9edd-977355592186\") " pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.409220 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"6d6cfa86-4356-4d79-9edd-977355592186\") " pod="openstack/rabbitmq-server-0" Oct 03 15:04:33 crc kubenswrapper[4774]: I1003 15:04:33.487816 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:33.795482 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="0a0a516a-bd97-4484-802b-71eb14f3ca3f" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:33.965484 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:34.058730 4774 generic.go:334] "Generic (PLEG): container finished" podID="0a0a516a-bd97-4484-802b-71eb14f3ca3f" containerID="4f460fb22aff9e6a313dab120e473cd831b1efea3d93c5bf04659fe9ff70d964" exitCode=0 Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:34.058798 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0a0a516a-bd97-4484-802b-71eb14f3ca3f","Type":"ContainerDied","Data":"4f460fb22aff9e6a313dab120e473cd831b1efea3d93c5bf04659fe9ff70d964"} Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:34.059879 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6d6cfa86-4356-4d79-9edd-977355592186","Type":"ContainerStarted","Data":"b216dbb159db98e678e5c213382a392762c9db9803eca7e23e3b04c4dd90516b"} Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 
15:04:34.677253 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:34.801511 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0a0a516a-bd97-4484-802b-71eb14f3ca3f-erlang-cookie-secret\") pod \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:34.802529 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0a0a516a-bd97-4484-802b-71eb14f3ca3f-pod-info\") pod \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:34.802584 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0a0a516a-bd97-4484-802b-71eb14f3ca3f-server-conf\") pod \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:34.802805 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw276\" (UniqueName: \"kubernetes.io/projected/0a0a516a-bd97-4484-802b-71eb14f3ca3f-kube-api-access-vw276\") pod \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:34.803425 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:34.803608 4774 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0a0a516a-bd97-4484-802b-71eb14f3ca3f-rabbitmq-plugins\") pod \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:34.803698 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0a0a516a-bd97-4484-802b-71eb14f3ca3f-rabbitmq-erlang-cookie\") pod \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:34.803734 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0a0a516a-bd97-4484-802b-71eb14f3ca3f-config-data\") pod \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:34.803827 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0a0a516a-bd97-4484-802b-71eb14f3ca3f-rabbitmq-tls\") pod \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:34.803856 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0a0a516a-bd97-4484-802b-71eb14f3ca3f-plugins-conf\") pod \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\" (UID: \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:34.804574 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0a0a516a-bd97-4484-802b-71eb14f3ca3f-rabbitmq-confd\") pod \"0a0a516a-bd97-4484-802b-71eb14f3ca3f\" (UID: 
\"0a0a516a-bd97-4484-802b-71eb14f3ca3f\") " Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:34.804358 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a0a516a-bd97-4484-802b-71eb14f3ca3f-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "0a0a516a-bd97-4484-802b-71eb14f3ca3f" (UID: "0a0a516a-bd97-4484-802b-71eb14f3ca3f"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:34.804580 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a0a516a-bd97-4484-802b-71eb14f3ca3f-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "0a0a516a-bd97-4484-802b-71eb14f3ca3f" (UID: "0a0a516a-bd97-4484-802b-71eb14f3ca3f"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:34.804719 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a0a516a-bd97-4484-802b-71eb14f3ca3f-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "0a0a516a-bd97-4484-802b-71eb14f3ca3f" (UID: "0a0a516a-bd97-4484-802b-71eb14f3ca3f"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:34.805077 4774 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0a0a516a-bd97-4484-802b-71eb14f3ca3f-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:34.805098 4774 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0a0a516a-bd97-4484-802b-71eb14f3ca3f-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:34.805113 4774 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0a0a516a-bd97-4484-802b-71eb14f3ca3f-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:34.808788 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a0a516a-bd97-4484-802b-71eb14f3ca3f-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "0a0a516a-bd97-4484-802b-71eb14f3ca3f" (UID: "0a0a516a-bd97-4484-802b-71eb14f3ca3f"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:34.809213 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a0a516a-bd97-4484-802b-71eb14f3ca3f-kube-api-access-vw276" (OuterVolumeSpecName: "kube-api-access-vw276") pod "0a0a516a-bd97-4484-802b-71eb14f3ca3f" (UID: "0a0a516a-bd97-4484-802b-71eb14f3ca3f"). InnerVolumeSpecName "kube-api-access-vw276". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:34.809942 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a0a516a-bd97-4484-802b-71eb14f3ca3f-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "0a0a516a-bd97-4484-802b-71eb14f3ca3f" (UID: "0a0a516a-bd97-4484-802b-71eb14f3ca3f"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:34.821493 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "persistence") pod "0a0a516a-bd97-4484-802b-71eb14f3ca3f" (UID: "0a0a516a-bd97-4484-802b-71eb14f3ca3f"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:34.821655 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/0a0a516a-bd97-4484-802b-71eb14f3ca3f-pod-info" (OuterVolumeSpecName: "pod-info") pod "0a0a516a-bd97-4484-802b-71eb14f3ca3f" (UID: "0a0a516a-bd97-4484-802b-71eb14f3ca3f"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:34.834427 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a0a516a-bd97-4484-802b-71eb14f3ca3f-config-data" (OuterVolumeSpecName: "config-data") pod "0a0a516a-bd97-4484-802b-71eb14f3ca3f" (UID: "0a0a516a-bd97-4484-802b-71eb14f3ca3f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:34.872849 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a0a516a-bd97-4484-802b-71eb14f3ca3f-server-conf" (OuterVolumeSpecName: "server-conf") pod "0a0a516a-bd97-4484-802b-71eb14f3ca3f" (UID: "0a0a516a-bd97-4484-802b-71eb14f3ca3f"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:34.906745 4774 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0a0a516a-bd97-4484-802b-71eb14f3ca3f-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:34.906794 4774 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0a0a516a-bd97-4484-802b-71eb14f3ca3f-pod-info\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:34.906813 4774 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0a0a516a-bd97-4484-802b-71eb14f3ca3f-server-conf\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:34.906830 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vw276\" (UniqueName: \"kubernetes.io/projected/0a0a516a-bd97-4484-802b-71eb14f3ca3f-kube-api-access-vw276\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:34.906876 4774 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:34.906893 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/0a0a516a-bd97-4484-802b-71eb14f3ca3f-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:34.906909 4774 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0a0a516a-bd97-4484-802b-71eb14f3ca3f-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:34 crc kubenswrapper[4774]: I1003 15:04:34.934679 4774 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.008051 4774 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.041261 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a0a516a-bd97-4484-802b-71eb14f3ca3f-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "0a0a516a-bd97-4484-802b-71eb14f3ca3f" (UID: "0a0a516a-bd97-4484-802b-71eb14f3ca3f"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.072621 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0a0a516a-bd97-4484-802b-71eb14f3ca3f","Type":"ContainerDied","Data":"0c2a85bc4b943600c668b1dac977ad3cbb5963f2c4256a283fbe1ed55e791e64"} Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.072673 4774 scope.go:117] "RemoveContainer" containerID="4f460fb22aff9e6a313dab120e473cd831b1efea3d93c5bf04659fe9ff70d964" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.072799 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.112575 4774 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0a0a516a-bd97-4484-802b-71eb14f3ca3f-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.216008 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.218340 4774 scope.go:117] "RemoveContainer" containerID="bf3375d50c95a1b4c96eaea4067faf1aa3e5e031209fd4b5cd965560980c24ae" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.224700 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.248763 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 15:04:35 crc kubenswrapper[4774]: E1003 15:04:35.249279 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a0a516a-bd97-4484-802b-71eb14f3ca3f" containerName="rabbitmq" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.249299 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a0a516a-bd97-4484-802b-71eb14f3ca3f" containerName="rabbitmq" Oct 03 15:04:35 crc kubenswrapper[4774]: E1003 15:04:35.249323 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a0a516a-bd97-4484-802b-71eb14f3ca3f" containerName="setup-container" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.249330 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a0a516a-bd97-4484-802b-71eb14f3ca3f" containerName="setup-container" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.249628 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a0a516a-bd97-4484-802b-71eb14f3ca3f" containerName="rabbitmq" Oct 03 15:04:35 crc 
kubenswrapper[4774]: I1003 15:04:35.250826 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.254354 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.254566 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.254709 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.254846 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.254990 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-2nlxm" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.255136 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.257304 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.258689 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.313870 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a0a516a-bd97-4484-802b-71eb14f3ca3f" path="/var/lib/kubelet/pods/0a0a516a-bd97-4484-802b-71eb14f3ca3f/volumes" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.418754 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/417bcf92-1c5e-4977-a197-62b603b795a2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"417bcf92-1c5e-4977-a197-62b603b795a2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.418809 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/417bcf92-1c5e-4977-a197-62b603b795a2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"417bcf92-1c5e-4977-a197-62b603b795a2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.418854 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/417bcf92-1c5e-4977-a197-62b603b795a2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"417bcf92-1c5e-4977-a197-62b603b795a2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.418992 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/417bcf92-1c5e-4977-a197-62b603b795a2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"417bcf92-1c5e-4977-a197-62b603b795a2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.419058 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/417bcf92-1c5e-4977-a197-62b603b795a2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"417bcf92-1c5e-4977-a197-62b603b795a2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.419119 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"417bcf92-1c5e-4977-a197-62b603b795a2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.419155 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/417bcf92-1c5e-4977-a197-62b603b795a2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"417bcf92-1c5e-4977-a197-62b603b795a2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.419179 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/417bcf92-1c5e-4977-a197-62b603b795a2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"417bcf92-1c5e-4977-a197-62b603b795a2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.419202 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhfjr\" (UniqueName: \"kubernetes.io/projected/417bcf92-1c5e-4977-a197-62b603b795a2-kube-api-access-nhfjr\") pod \"rabbitmq-cell1-server-0\" (UID: \"417bcf92-1c5e-4977-a197-62b603b795a2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.419411 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/417bcf92-1c5e-4977-a197-62b603b795a2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"417bcf92-1c5e-4977-a197-62b603b795a2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.419473 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/417bcf92-1c5e-4977-a197-62b603b795a2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"417bcf92-1c5e-4977-a197-62b603b795a2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.521274 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/417bcf92-1c5e-4977-a197-62b603b795a2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"417bcf92-1c5e-4977-a197-62b603b795a2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.521320 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/417bcf92-1c5e-4977-a197-62b603b795a2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"417bcf92-1c5e-4977-a197-62b603b795a2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.521364 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"417bcf92-1c5e-4977-a197-62b603b795a2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.521405 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/417bcf92-1c5e-4977-a197-62b603b795a2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"417bcf92-1c5e-4977-a197-62b603b795a2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.521429 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/417bcf92-1c5e-4977-a197-62b603b795a2-server-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"417bcf92-1c5e-4977-a197-62b603b795a2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.521445 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhfjr\" (UniqueName: \"kubernetes.io/projected/417bcf92-1c5e-4977-a197-62b603b795a2-kube-api-access-nhfjr\") pod \"rabbitmq-cell1-server-0\" (UID: \"417bcf92-1c5e-4977-a197-62b603b795a2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.521491 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/417bcf92-1c5e-4977-a197-62b603b795a2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"417bcf92-1c5e-4977-a197-62b603b795a2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.521525 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/417bcf92-1c5e-4977-a197-62b603b795a2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"417bcf92-1c5e-4977-a197-62b603b795a2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.521543 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/417bcf92-1c5e-4977-a197-62b603b795a2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"417bcf92-1c5e-4977-a197-62b603b795a2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.521561 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/417bcf92-1c5e-4977-a197-62b603b795a2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"417bcf92-1c5e-4977-a197-62b603b795a2\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.521584 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/417bcf92-1c5e-4977-a197-62b603b795a2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"417bcf92-1c5e-4977-a197-62b603b795a2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.521707 4774 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"417bcf92-1c5e-4977-a197-62b603b795a2\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.522501 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/417bcf92-1c5e-4977-a197-62b603b795a2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"417bcf92-1c5e-4977-a197-62b603b795a2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.522730 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/417bcf92-1c5e-4977-a197-62b603b795a2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"417bcf92-1c5e-4977-a197-62b603b795a2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.522882 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/417bcf92-1c5e-4977-a197-62b603b795a2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"417bcf92-1c5e-4977-a197-62b603b795a2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.523049 4774 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/417bcf92-1c5e-4977-a197-62b603b795a2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"417bcf92-1c5e-4977-a197-62b603b795a2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.523216 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/417bcf92-1c5e-4977-a197-62b603b795a2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"417bcf92-1c5e-4977-a197-62b603b795a2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.524984 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/417bcf92-1c5e-4977-a197-62b603b795a2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"417bcf92-1c5e-4977-a197-62b603b795a2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.525413 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/417bcf92-1c5e-4977-a197-62b603b795a2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"417bcf92-1c5e-4977-a197-62b603b795a2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.525591 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/417bcf92-1c5e-4977-a197-62b603b795a2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"417bcf92-1c5e-4977-a197-62b603b795a2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.527640 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/417bcf92-1c5e-4977-a197-62b603b795a2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"417bcf92-1c5e-4977-a197-62b603b795a2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.538355 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhfjr\" (UniqueName: \"kubernetes.io/projected/417bcf92-1c5e-4977-a197-62b603b795a2-kube-api-access-nhfjr\") pod \"rabbitmq-cell1-server-0\" (UID: \"417bcf92-1c5e-4977-a197-62b603b795a2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.551198 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"417bcf92-1c5e-4977-a197-62b603b795a2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.575711 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:04:35 crc kubenswrapper[4774]: W1003 15:04:35.838451 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod417bcf92_1c5e_4977_a197_62b603b795a2.slice/crio-153297402c3463459d139a30401fb7b3fc1476859c36bda4d1d894d917c53212 WatchSource:0}: Error finding container 153297402c3463459d139a30401fb7b3fc1476859c36bda4d1d894d917c53212: Status 404 returned error can't find the container with id 153297402c3463459d139a30401fb7b3fc1476859c36bda4d1d894d917c53212 Oct 03 15:04:35 crc kubenswrapper[4774]: I1003 15:04:35.839082 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 15:04:36 crc kubenswrapper[4774]: I1003 15:04:36.085133 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6d6cfa86-4356-4d79-9edd-977355592186","Type":"ContainerStarted","Data":"24ed5b3bb865fde0246fcf5b8f4b83daa696a05b41c04220d3fe3beb29b17384"} Oct 03 15:04:36 crc kubenswrapper[4774]: I1003 15:04:36.086982 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"417bcf92-1c5e-4977-a197-62b603b795a2","Type":"ContainerStarted","Data":"153297402c3463459d139a30401fb7b3fc1476859c36bda4d1d894d917c53212"} Oct 03 15:04:36 crc kubenswrapper[4774]: I1003 15:04:36.912026 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-mtmsn"] Oct 03 15:04:36 crc kubenswrapper[4774]: I1003 15:04:36.914668 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-mtmsn" Oct 03 15:04:36 crc kubenswrapper[4774]: I1003 15:04:36.916778 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 03 15:04:36 crc kubenswrapper[4774]: I1003 15:04:36.926569 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-mtmsn"] Oct 03 15:04:37 crc kubenswrapper[4774]: I1003 15:04:37.053242 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4t6x\" (UniqueName: \"kubernetes.io/projected/01cdd1f4-191a-4859-a27c-0b65d33ade12-kube-api-access-p4t6x\") pod \"dnsmasq-dns-67b789f86c-mtmsn\" (UID: \"01cdd1f4-191a-4859-a27c-0b65d33ade12\") " pod="openstack/dnsmasq-dns-67b789f86c-mtmsn" Oct 03 15:04:37 crc kubenswrapper[4774]: I1003 15:04:37.053304 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-mtmsn\" (UID: \"01cdd1f4-191a-4859-a27c-0b65d33ade12\") " pod="openstack/dnsmasq-dns-67b789f86c-mtmsn" Oct 03 15:04:37 crc kubenswrapper[4774]: I1003 15:04:37.053344 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-mtmsn\" (UID: \"01cdd1f4-191a-4859-a27c-0b65d33ade12\") " pod="openstack/dnsmasq-dns-67b789f86c-mtmsn" Oct 03 15:04:37 crc kubenswrapper[4774]: I1003 15:04:37.053388 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-config\") pod \"dnsmasq-dns-67b789f86c-mtmsn\" (UID: \"01cdd1f4-191a-4859-a27c-0b65d33ade12\") " 
pod="openstack/dnsmasq-dns-67b789f86c-mtmsn" Oct 03 15:04:37 crc kubenswrapper[4774]: I1003 15:04:37.053410 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-mtmsn\" (UID: \"01cdd1f4-191a-4859-a27c-0b65d33ade12\") " pod="openstack/dnsmasq-dns-67b789f86c-mtmsn" Oct 03 15:04:37 crc kubenswrapper[4774]: I1003 15:04:37.053501 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-dns-svc\") pod \"dnsmasq-dns-67b789f86c-mtmsn\" (UID: \"01cdd1f4-191a-4859-a27c-0b65d33ade12\") " pod="openstack/dnsmasq-dns-67b789f86c-mtmsn" Oct 03 15:04:37 crc kubenswrapper[4774]: I1003 15:04:37.053617 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-mtmsn\" (UID: \"01cdd1f4-191a-4859-a27c-0b65d33ade12\") " pod="openstack/dnsmasq-dns-67b789f86c-mtmsn" Oct 03 15:04:37 crc kubenswrapper[4774]: I1003 15:04:37.154901 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-mtmsn\" (UID: \"01cdd1f4-191a-4859-a27c-0b65d33ade12\") " pod="openstack/dnsmasq-dns-67b789f86c-mtmsn" Oct 03 15:04:37 crc kubenswrapper[4774]: I1003 15:04:37.154986 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4t6x\" (UniqueName: \"kubernetes.io/projected/01cdd1f4-191a-4859-a27c-0b65d33ade12-kube-api-access-p4t6x\") pod \"dnsmasq-dns-67b789f86c-mtmsn\" (UID: 
\"01cdd1f4-191a-4859-a27c-0b65d33ade12\") " pod="openstack/dnsmasq-dns-67b789f86c-mtmsn" Oct 03 15:04:37 crc kubenswrapper[4774]: I1003 15:04:37.155061 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-mtmsn\" (UID: \"01cdd1f4-191a-4859-a27c-0b65d33ade12\") " pod="openstack/dnsmasq-dns-67b789f86c-mtmsn" Oct 03 15:04:37 crc kubenswrapper[4774]: I1003 15:04:37.155119 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-mtmsn\" (UID: \"01cdd1f4-191a-4859-a27c-0b65d33ade12\") " pod="openstack/dnsmasq-dns-67b789f86c-mtmsn" Oct 03 15:04:37 crc kubenswrapper[4774]: I1003 15:04:37.155181 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-config\") pod \"dnsmasq-dns-67b789f86c-mtmsn\" (UID: \"01cdd1f4-191a-4859-a27c-0b65d33ade12\") " pod="openstack/dnsmasq-dns-67b789f86c-mtmsn" Oct 03 15:04:37 crc kubenswrapper[4774]: I1003 15:04:37.155220 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-mtmsn\" (UID: \"01cdd1f4-191a-4859-a27c-0b65d33ade12\") " pod="openstack/dnsmasq-dns-67b789f86c-mtmsn" Oct 03 15:04:37 crc kubenswrapper[4774]: I1003 15:04:37.155324 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-dns-svc\") pod \"dnsmasq-dns-67b789f86c-mtmsn\" (UID: \"01cdd1f4-191a-4859-a27c-0b65d33ade12\") " 
pod="openstack/dnsmasq-dns-67b789f86c-mtmsn" Oct 03 15:04:37 crc kubenswrapper[4774]: I1003 15:04:37.156346 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-mtmsn\" (UID: \"01cdd1f4-191a-4859-a27c-0b65d33ade12\") " pod="openstack/dnsmasq-dns-67b789f86c-mtmsn" Oct 03 15:04:37 crc kubenswrapper[4774]: I1003 15:04:37.156553 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-mtmsn\" (UID: \"01cdd1f4-191a-4859-a27c-0b65d33ade12\") " pod="openstack/dnsmasq-dns-67b789f86c-mtmsn" Oct 03 15:04:37 crc kubenswrapper[4774]: I1003 15:04:37.156658 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-mtmsn\" (UID: \"01cdd1f4-191a-4859-a27c-0b65d33ade12\") " pod="openstack/dnsmasq-dns-67b789f86c-mtmsn" Oct 03 15:04:37 crc kubenswrapper[4774]: I1003 15:04:37.156763 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-mtmsn\" (UID: \"01cdd1f4-191a-4859-a27c-0b65d33ade12\") " pod="openstack/dnsmasq-dns-67b789f86c-mtmsn" Oct 03 15:04:37 crc kubenswrapper[4774]: I1003 15:04:37.156778 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-dns-svc\") pod \"dnsmasq-dns-67b789f86c-mtmsn\" (UID: \"01cdd1f4-191a-4859-a27c-0b65d33ade12\") " pod="openstack/dnsmasq-dns-67b789f86c-mtmsn" Oct 03 15:04:37 crc kubenswrapper[4774]: 
I1003 15:04:37.157261 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-config\") pod \"dnsmasq-dns-67b789f86c-mtmsn\" (UID: \"01cdd1f4-191a-4859-a27c-0b65d33ade12\") " pod="openstack/dnsmasq-dns-67b789f86c-mtmsn" Oct 03 15:04:37 crc kubenswrapper[4774]: I1003 15:04:37.250909 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4t6x\" (UniqueName: \"kubernetes.io/projected/01cdd1f4-191a-4859-a27c-0b65d33ade12-kube-api-access-p4t6x\") pod \"dnsmasq-dns-67b789f86c-mtmsn\" (UID: \"01cdd1f4-191a-4859-a27c-0b65d33ade12\") " pod="openstack/dnsmasq-dns-67b789f86c-mtmsn" Oct 03 15:04:37 crc kubenswrapper[4774]: I1003 15:04:37.537170 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-mtmsn" Oct 03 15:04:38 crc kubenswrapper[4774]: I1003 15:04:38.052184 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-mtmsn"] Oct 03 15:04:38 crc kubenswrapper[4774]: I1003 15:04:38.109668 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-mtmsn" event={"ID":"01cdd1f4-191a-4859-a27c-0b65d33ade12","Type":"ContainerStarted","Data":"90780a835f26ed076772800a0cf671948d30a32c83e56c069d6f58059f31afbb"} Oct 03 15:04:38 crc kubenswrapper[4774]: I1003 15:04:38.111064 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"417bcf92-1c5e-4977-a197-62b603b795a2","Type":"ContainerStarted","Data":"eab10308b172fa5c011b87b2146814087da80355ec95d9ab4b9742375077b152"} Oct 03 15:04:39 crc kubenswrapper[4774]: I1003 15:04:39.125019 4774 generic.go:334] "Generic (PLEG): container finished" podID="01cdd1f4-191a-4859-a27c-0b65d33ade12" containerID="15d9dab72c78ee548a6c63a060de73e7718761f07a33248fee0e9a88ffca49fb" exitCode=0 Oct 03 15:04:39 crc kubenswrapper[4774]: I1003 
15:04:39.125224 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-mtmsn" event={"ID":"01cdd1f4-191a-4859-a27c-0b65d33ade12","Type":"ContainerDied","Data":"15d9dab72c78ee548a6c63a060de73e7718761f07a33248fee0e9a88ffca49fb"} Oct 03 15:04:40 crc kubenswrapper[4774]: I1003 15:04:40.138315 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-mtmsn" event={"ID":"01cdd1f4-191a-4859-a27c-0b65d33ade12","Type":"ContainerStarted","Data":"301385af9b00fa1203df8341711ae7bdb12250a25edf099eb9b2b72091891c58"} Oct 03 15:04:40 crc kubenswrapper[4774]: I1003 15:04:40.138870 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67b789f86c-mtmsn" Oct 03 15:04:40 crc kubenswrapper[4774]: I1003 15:04:40.181806 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67b789f86c-mtmsn" podStartSLOduration=4.181780953 podStartE2EDuration="4.181780953s" podCreationTimestamp="2025-10-03 15:04:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:04:40.176174034 +0000 UTC m=+1302.765377506" watchObservedRunningTime="2025-10-03 15:04:40.181780953 +0000 UTC m=+1302.770984405" Oct 03 15:04:47 crc kubenswrapper[4774]: I1003 15:04:47.539800 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67b789f86c-mtmsn" Oct 03 15:04:47 crc kubenswrapper[4774]: I1003 15:04:47.668810 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-wsntx"] Oct 03 15:04:47 crc kubenswrapper[4774]: I1003 15:04:47.669094 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59cf4bdb65-wsntx" podUID="12ecba83-5e7f-4bec-a930-a02540cbde61" containerName="dnsmasq-dns" 
containerID="cri-o://fd8f499f5e75df870fb2fcc1e32669506cbbe81ca292f4dd32d6228b8cbe4998" gracePeriod=10 Oct 03 15:04:47 crc kubenswrapper[4774]: I1003 15:04:47.854872 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-vrzcl"] Oct 03 15:04:47 crc kubenswrapper[4774]: I1003 15:04:47.857446 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb6ffcf87-vrzcl" Oct 03 15:04:47 crc kubenswrapper[4774]: I1003 15:04:47.865494 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-59cf4bdb65-wsntx" podUID="12ecba83-5e7f-4bec-a930-a02540cbde61" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.200:5353: connect: connection refused" Oct 03 15:04:47 crc kubenswrapper[4774]: I1003 15:04:47.876534 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-vrzcl"] Oct 03 15:04:47 crc kubenswrapper[4774]: I1003 15:04:47.995909 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-vrzcl\" (UID: \"6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4\") " pod="openstack/dnsmasq-dns-cb6ffcf87-vrzcl" Oct 03 15:04:47 crc kubenswrapper[4774]: I1003 15:04:47.995993 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-vrzcl\" (UID: \"6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4\") " pod="openstack/dnsmasq-dns-cb6ffcf87-vrzcl" Oct 03 15:04:47 crc kubenswrapper[4774]: I1003 15:04:47.996154 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4-config\") pod \"dnsmasq-dns-cb6ffcf87-vrzcl\" (UID: \"6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4\") " pod="openstack/dnsmasq-dns-cb6ffcf87-vrzcl" Oct 03 15:04:47 crc kubenswrapper[4774]: I1003 15:04:47.996233 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-vrzcl\" (UID: \"6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4\") " pod="openstack/dnsmasq-dns-cb6ffcf87-vrzcl" Oct 03 15:04:47 crc kubenswrapper[4774]: I1003 15:04:47.996525 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdw67\" (UniqueName: \"kubernetes.io/projected/6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4-kube-api-access-sdw67\") pod \"dnsmasq-dns-cb6ffcf87-vrzcl\" (UID: \"6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4\") " pod="openstack/dnsmasq-dns-cb6ffcf87-vrzcl" Oct 03 15:04:47 crc kubenswrapper[4774]: I1003 15:04:47.996581 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-vrzcl\" (UID: \"6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4\") " pod="openstack/dnsmasq-dns-cb6ffcf87-vrzcl" Oct 03 15:04:47 crc kubenswrapper[4774]: I1003 15:04:47.996609 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-vrzcl\" (UID: \"6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4\") " pod="openstack/dnsmasq-dns-cb6ffcf87-vrzcl" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.098519 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-sdw67\" (UniqueName: \"kubernetes.io/projected/6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4-kube-api-access-sdw67\") pod \"dnsmasq-dns-cb6ffcf87-vrzcl\" (UID: \"6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4\") " pod="openstack/dnsmasq-dns-cb6ffcf87-vrzcl" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.098582 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-vrzcl\" (UID: \"6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4\") " pod="openstack/dnsmasq-dns-cb6ffcf87-vrzcl" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.098610 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-vrzcl\" (UID: \"6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4\") " pod="openstack/dnsmasq-dns-cb6ffcf87-vrzcl" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.098648 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-vrzcl\" (UID: \"6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4\") " pod="openstack/dnsmasq-dns-cb6ffcf87-vrzcl" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.098673 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-vrzcl\" (UID: \"6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4\") " pod="openstack/dnsmasq-dns-cb6ffcf87-vrzcl" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.098734 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4-config\") pod \"dnsmasq-dns-cb6ffcf87-vrzcl\" (UID: \"6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4\") " pod="openstack/dnsmasq-dns-cb6ffcf87-vrzcl" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.098758 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-vrzcl\" (UID: \"6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4\") " pod="openstack/dnsmasq-dns-cb6ffcf87-vrzcl" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.099896 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-vrzcl\" (UID: \"6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4\") " pod="openstack/dnsmasq-dns-cb6ffcf87-vrzcl" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.099943 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-vrzcl\" (UID: \"6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4\") " pod="openstack/dnsmasq-dns-cb6ffcf87-vrzcl" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.100748 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-vrzcl\" (UID: \"6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4\") " pod="openstack/dnsmasq-dns-cb6ffcf87-vrzcl" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.101386 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4-dns-svc\") pod 
\"dnsmasq-dns-cb6ffcf87-vrzcl\" (UID: \"6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4\") " pod="openstack/dnsmasq-dns-cb6ffcf87-vrzcl" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.101805 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4-config\") pod \"dnsmasq-dns-cb6ffcf87-vrzcl\" (UID: \"6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4\") " pod="openstack/dnsmasq-dns-cb6ffcf87-vrzcl" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.112010 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-vrzcl\" (UID: \"6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4\") " pod="openstack/dnsmasq-dns-cb6ffcf87-vrzcl" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.122812 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdw67\" (UniqueName: \"kubernetes.io/projected/6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4-kube-api-access-sdw67\") pod \"dnsmasq-dns-cb6ffcf87-vrzcl\" (UID: \"6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4\") " pod="openstack/dnsmasq-dns-cb6ffcf87-vrzcl" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.197061 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-wsntx" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.210463 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb6ffcf87-vrzcl" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.247167 4774 generic.go:334] "Generic (PLEG): container finished" podID="12ecba83-5e7f-4bec-a930-a02540cbde61" containerID="fd8f499f5e75df870fb2fcc1e32669506cbbe81ca292f4dd32d6228b8cbe4998" exitCode=0 Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.247246 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-wsntx" event={"ID":"12ecba83-5e7f-4bec-a930-a02540cbde61","Type":"ContainerDied","Data":"fd8f499f5e75df870fb2fcc1e32669506cbbe81ca292f4dd32d6228b8cbe4998"} Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.247276 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-wsntx" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.247299 4774 scope.go:117] "RemoveContainer" containerID="fd8f499f5e75df870fb2fcc1e32669506cbbe81ca292f4dd32d6228b8cbe4998" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.247276 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-wsntx" event={"ID":"12ecba83-5e7f-4bec-a930-a02540cbde61","Type":"ContainerDied","Data":"98ed8c910d82f0bb860453accdb94a4bb2537970d93be35c61190584082dcecb"} Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.308513 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxqqw\" (UniqueName: \"kubernetes.io/projected/12ecba83-5e7f-4bec-a930-a02540cbde61-kube-api-access-sxqqw\") pod \"12ecba83-5e7f-4bec-a930-a02540cbde61\" (UID: \"12ecba83-5e7f-4bec-a930-a02540cbde61\") " Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.308657 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12ecba83-5e7f-4bec-a930-a02540cbde61-ovsdbserver-sb\") pod \"12ecba83-5e7f-4bec-a930-a02540cbde61\" (UID: 
\"12ecba83-5e7f-4bec-a930-a02540cbde61\") " Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.308687 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12ecba83-5e7f-4bec-a930-a02540cbde61-ovsdbserver-nb\") pod \"12ecba83-5e7f-4bec-a930-a02540cbde61\" (UID: \"12ecba83-5e7f-4bec-a930-a02540cbde61\") " Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.308727 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12ecba83-5e7f-4bec-a930-a02540cbde61-dns-svc\") pod \"12ecba83-5e7f-4bec-a930-a02540cbde61\" (UID: \"12ecba83-5e7f-4bec-a930-a02540cbde61\") " Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.308789 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/12ecba83-5e7f-4bec-a930-a02540cbde61-dns-swift-storage-0\") pod \"12ecba83-5e7f-4bec-a930-a02540cbde61\" (UID: \"12ecba83-5e7f-4bec-a930-a02540cbde61\") " Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.308858 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12ecba83-5e7f-4bec-a930-a02540cbde61-config\") pod \"12ecba83-5e7f-4bec-a930-a02540cbde61\" (UID: \"12ecba83-5e7f-4bec-a930-a02540cbde61\") " Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.323623 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12ecba83-5e7f-4bec-a930-a02540cbde61-kube-api-access-sxqqw" (OuterVolumeSpecName: "kube-api-access-sxqqw") pod "12ecba83-5e7f-4bec-a930-a02540cbde61" (UID: "12ecba83-5e7f-4bec-a930-a02540cbde61"). InnerVolumeSpecName "kube-api-access-sxqqw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.393688 4774 scope.go:117] "RemoveContainer" containerID="77467c335237a1ec1f212a7b6867d099fc810f7c9b82767748f820e81d36e19d" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.394604 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12ecba83-5e7f-4bec-a930-a02540cbde61-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "12ecba83-5e7f-4bec-a930-a02540cbde61" (UID: "12ecba83-5e7f-4bec-a930-a02540cbde61"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.394677 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12ecba83-5e7f-4bec-a930-a02540cbde61-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "12ecba83-5e7f-4bec-a930-a02540cbde61" (UID: "12ecba83-5e7f-4bec-a930-a02540cbde61"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.400000 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12ecba83-5e7f-4bec-a930-a02540cbde61-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "12ecba83-5e7f-4bec-a930-a02540cbde61" (UID: "12ecba83-5e7f-4bec-a930-a02540cbde61"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.400291 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12ecba83-5e7f-4bec-a930-a02540cbde61-config" (OuterVolumeSpecName: "config") pod "12ecba83-5e7f-4bec-a930-a02540cbde61" (UID: "12ecba83-5e7f-4bec-a930-a02540cbde61"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.411495 4774 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12ecba83-5e7f-4bec-a930-a02540cbde61-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.411795 4774 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12ecba83-5e7f-4bec-a930-a02540cbde61-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.411805 4774 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/12ecba83-5e7f-4bec-a930-a02540cbde61-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.411816 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12ecba83-5e7f-4bec-a930-a02540cbde61-config\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.411828 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxqqw\" (UniqueName: \"kubernetes.io/projected/12ecba83-5e7f-4bec-a930-a02540cbde61-kube-api-access-sxqqw\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.441131 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12ecba83-5e7f-4bec-a930-a02540cbde61-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "12ecba83-5e7f-4bec-a930-a02540cbde61" (UID: "12ecba83-5e7f-4bec-a930-a02540cbde61"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.451852 4774 scope.go:117] "RemoveContainer" containerID="fd8f499f5e75df870fb2fcc1e32669506cbbe81ca292f4dd32d6228b8cbe4998" Oct 03 15:04:48 crc kubenswrapper[4774]: E1003 15:04:48.452178 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd8f499f5e75df870fb2fcc1e32669506cbbe81ca292f4dd32d6228b8cbe4998\": container with ID starting with fd8f499f5e75df870fb2fcc1e32669506cbbe81ca292f4dd32d6228b8cbe4998 not found: ID does not exist" containerID="fd8f499f5e75df870fb2fcc1e32669506cbbe81ca292f4dd32d6228b8cbe4998" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.452214 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd8f499f5e75df870fb2fcc1e32669506cbbe81ca292f4dd32d6228b8cbe4998"} err="failed to get container status \"fd8f499f5e75df870fb2fcc1e32669506cbbe81ca292f4dd32d6228b8cbe4998\": rpc error: code = NotFound desc = could not find container \"fd8f499f5e75df870fb2fcc1e32669506cbbe81ca292f4dd32d6228b8cbe4998\": container with ID starting with fd8f499f5e75df870fb2fcc1e32669506cbbe81ca292f4dd32d6228b8cbe4998 not found: ID does not exist" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.452241 4774 scope.go:117] "RemoveContainer" containerID="77467c335237a1ec1f212a7b6867d099fc810f7c9b82767748f820e81d36e19d" Oct 03 15:04:48 crc kubenswrapper[4774]: E1003 15:04:48.452645 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77467c335237a1ec1f212a7b6867d099fc810f7c9b82767748f820e81d36e19d\": container with ID starting with 77467c335237a1ec1f212a7b6867d099fc810f7c9b82767748f820e81d36e19d not found: ID does not exist" containerID="77467c335237a1ec1f212a7b6867d099fc810f7c9b82767748f820e81d36e19d" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.452672 
4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77467c335237a1ec1f212a7b6867d099fc810f7c9b82767748f820e81d36e19d"} err="failed to get container status \"77467c335237a1ec1f212a7b6867d099fc810f7c9b82767748f820e81d36e19d\": rpc error: code = NotFound desc = could not find container \"77467c335237a1ec1f212a7b6867d099fc810f7c9b82767748f820e81d36e19d\": container with ID starting with 77467c335237a1ec1f212a7b6867d099fc810f7c9b82767748f820e81d36e19d not found: ID does not exist" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.515350 4774 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12ecba83-5e7f-4bec-a930-a02540cbde61-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.583875 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-wsntx"] Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.591477 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-wsntx"] Oct 03 15:04:48 crc kubenswrapper[4774]: I1003 15:04:48.720661 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-vrzcl"] Oct 03 15:04:48 crc kubenswrapper[4774]: W1003 15:04:48.721215 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fd3f029_cab3_4e3f_ab8e_8f156c6bdaf4.slice/crio-13b780417b8aa093ea2173e82c19c3c853ab0e5fccf2a6e3bcda808cf7fa8d75 WatchSource:0}: Error finding container 13b780417b8aa093ea2173e82c19c3c853ab0e5fccf2a6e3bcda808cf7fa8d75: Status 404 returned error can't find the container with id 13b780417b8aa093ea2173e82c19c3c853ab0e5fccf2a6e3bcda808cf7fa8d75 Oct 03 15:04:49 crc kubenswrapper[4774]: I1003 15:04:49.259963 4774 generic.go:334] "Generic (PLEG): container finished" podID="6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4" 
containerID="a43c7e764b968cbc12995d2216978eb64152394d1b3e2a23224d7c082c712070" exitCode=0 Oct 03 15:04:49 crc kubenswrapper[4774]: I1003 15:04:49.260017 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-vrzcl" event={"ID":"6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4","Type":"ContainerDied","Data":"a43c7e764b968cbc12995d2216978eb64152394d1b3e2a23224d7c082c712070"} Oct 03 15:04:49 crc kubenswrapper[4774]: I1003 15:04:49.260317 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-vrzcl" event={"ID":"6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4","Type":"ContainerStarted","Data":"13b780417b8aa093ea2173e82c19c3c853ab0e5fccf2a6e3bcda808cf7fa8d75"} Oct 03 15:04:49 crc kubenswrapper[4774]: I1003 15:04:49.311087 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12ecba83-5e7f-4bec-a930-a02540cbde61" path="/var/lib/kubelet/pods/12ecba83-5e7f-4bec-a930-a02540cbde61/volumes" Oct 03 15:04:50 crc kubenswrapper[4774]: I1003 15:04:50.273660 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-vrzcl" event={"ID":"6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4","Type":"ContainerStarted","Data":"80d747e8c2f2ac4acf64b75bceeee836b169f0961af070273c3e8499dc465a71"} Oct 03 15:04:50 crc kubenswrapper[4774]: I1003 15:04:50.274193 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cb6ffcf87-vrzcl" Oct 03 15:04:50 crc kubenswrapper[4774]: I1003 15:04:50.299966 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cb6ffcf87-vrzcl" podStartSLOduration=3.2999517259999998 podStartE2EDuration="3.299951726s" podCreationTimestamp="2025-10-03 15:04:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:04:50.29728323 +0000 UTC m=+1312.886486682" watchObservedRunningTime="2025-10-03 
15:04:50.299951726 +0000 UTC m=+1312.889155178" Oct 03 15:04:58 crc kubenswrapper[4774]: I1003 15:04:58.212643 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cb6ffcf87-vrzcl" Oct 03 15:04:58 crc kubenswrapper[4774]: I1003 15:04:58.311521 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-mtmsn"] Oct 03 15:04:58 crc kubenswrapper[4774]: I1003 15:04:58.311780 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67b789f86c-mtmsn" podUID="01cdd1f4-191a-4859-a27c-0b65d33ade12" containerName="dnsmasq-dns" containerID="cri-o://301385af9b00fa1203df8341711ae7bdb12250a25edf099eb9b2b72091891c58" gracePeriod=10 Oct 03 15:04:58 crc kubenswrapper[4774]: I1003 15:04:58.803513 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-mtmsn" Oct 03 15:04:58 crc kubenswrapper[4774]: I1003 15:04:58.858925 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-dns-svc\") pod \"01cdd1f4-191a-4859-a27c-0b65d33ade12\" (UID: \"01cdd1f4-191a-4859-a27c-0b65d33ade12\") " Oct 03 15:04:58 crc kubenswrapper[4774]: I1003 15:04:58.859092 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-config\") pod \"01cdd1f4-191a-4859-a27c-0b65d33ade12\" (UID: \"01cdd1f4-191a-4859-a27c-0b65d33ade12\") " Oct 03 15:04:58 crc kubenswrapper[4774]: I1003 15:04:58.859126 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-ovsdbserver-nb\") pod \"01cdd1f4-191a-4859-a27c-0b65d33ade12\" (UID: \"01cdd1f4-191a-4859-a27c-0b65d33ade12\") " Oct 03 15:04:58 crc 
kubenswrapper[4774]: I1003 15:04:58.859172 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-dns-swift-storage-0\") pod \"01cdd1f4-191a-4859-a27c-0b65d33ade12\" (UID: \"01cdd1f4-191a-4859-a27c-0b65d33ade12\") " Oct 03 15:04:58 crc kubenswrapper[4774]: I1003 15:04:58.859236 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-ovsdbserver-sb\") pod \"01cdd1f4-191a-4859-a27c-0b65d33ade12\" (UID: \"01cdd1f4-191a-4859-a27c-0b65d33ade12\") " Oct 03 15:04:58 crc kubenswrapper[4774]: I1003 15:04:58.859269 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-openstack-edpm-ipam\") pod \"01cdd1f4-191a-4859-a27c-0b65d33ade12\" (UID: \"01cdd1f4-191a-4859-a27c-0b65d33ade12\") " Oct 03 15:04:58 crc kubenswrapper[4774]: I1003 15:04:58.859303 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4t6x\" (UniqueName: \"kubernetes.io/projected/01cdd1f4-191a-4859-a27c-0b65d33ade12-kube-api-access-p4t6x\") pod \"01cdd1f4-191a-4859-a27c-0b65d33ade12\" (UID: \"01cdd1f4-191a-4859-a27c-0b65d33ade12\") " Oct 03 15:04:58 crc kubenswrapper[4774]: I1003 15:04:58.866914 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01cdd1f4-191a-4859-a27c-0b65d33ade12-kube-api-access-p4t6x" (OuterVolumeSpecName: "kube-api-access-p4t6x") pod "01cdd1f4-191a-4859-a27c-0b65d33ade12" (UID: "01cdd1f4-191a-4859-a27c-0b65d33ade12"). InnerVolumeSpecName "kube-api-access-p4t6x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:04:58 crc kubenswrapper[4774]: I1003 15:04:58.914206 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "01cdd1f4-191a-4859-a27c-0b65d33ade12" (UID: "01cdd1f4-191a-4859-a27c-0b65d33ade12"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:04:58 crc kubenswrapper[4774]: I1003 15:04:58.916748 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "01cdd1f4-191a-4859-a27c-0b65d33ade12" (UID: "01cdd1f4-191a-4859-a27c-0b65d33ade12"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:04:58 crc kubenswrapper[4774]: I1003 15:04:58.920917 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "01cdd1f4-191a-4859-a27c-0b65d33ade12" (UID: "01cdd1f4-191a-4859-a27c-0b65d33ade12"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:04:58 crc kubenswrapper[4774]: I1003 15:04:58.922503 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-config" (OuterVolumeSpecName: "config") pod "01cdd1f4-191a-4859-a27c-0b65d33ade12" (UID: "01cdd1f4-191a-4859-a27c-0b65d33ade12"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:04:58 crc kubenswrapper[4774]: I1003 15:04:58.927031 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "01cdd1f4-191a-4859-a27c-0b65d33ade12" (UID: "01cdd1f4-191a-4859-a27c-0b65d33ade12"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:04:58 crc kubenswrapper[4774]: I1003 15:04:58.930996 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "01cdd1f4-191a-4859-a27c-0b65d33ade12" (UID: "01cdd1f4-191a-4859-a27c-0b65d33ade12"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:04:58 crc kubenswrapper[4774]: I1003 15:04:58.961517 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-config\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:58 crc kubenswrapper[4774]: I1003 15:04:58.961555 4774 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:58 crc kubenswrapper[4774]: I1003 15:04:58.961568 4774 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:58 crc kubenswrapper[4774]: I1003 15:04:58.961577 4774 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-ovsdbserver-sb\") on node 
\"crc\" DevicePath \"\"" Oct 03 15:04:58 crc kubenswrapper[4774]: I1003 15:04:58.961591 4774 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:58 crc kubenswrapper[4774]: I1003 15:04:58.961599 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4t6x\" (UniqueName: \"kubernetes.io/projected/01cdd1f4-191a-4859-a27c-0b65d33ade12-kube-api-access-p4t6x\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:58 crc kubenswrapper[4774]: I1003 15:04:58.961609 4774 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01cdd1f4-191a-4859-a27c-0b65d33ade12-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:59 crc kubenswrapper[4774]: I1003 15:04:59.373190 4774 generic.go:334] "Generic (PLEG): container finished" podID="01cdd1f4-191a-4859-a27c-0b65d33ade12" containerID="301385af9b00fa1203df8341711ae7bdb12250a25edf099eb9b2b72091891c58" exitCode=0 Oct 03 15:04:59 crc kubenswrapper[4774]: I1003 15:04:59.373249 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-mtmsn" event={"ID":"01cdd1f4-191a-4859-a27c-0b65d33ade12","Type":"ContainerDied","Data":"301385af9b00fa1203df8341711ae7bdb12250a25edf099eb9b2b72091891c58"} Oct 03 15:04:59 crc kubenswrapper[4774]: I1003 15:04:59.373284 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-mtmsn" event={"ID":"01cdd1f4-191a-4859-a27c-0b65d33ade12","Type":"ContainerDied","Data":"90780a835f26ed076772800a0cf671948d30a32c83e56c069d6f58059f31afbb"} Oct 03 15:04:59 crc kubenswrapper[4774]: I1003 15:04:59.373298 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-mtmsn" Oct 03 15:04:59 crc kubenswrapper[4774]: I1003 15:04:59.373314 4774 scope.go:117] "RemoveContainer" containerID="301385af9b00fa1203df8341711ae7bdb12250a25edf099eb9b2b72091891c58" Oct 03 15:04:59 crc kubenswrapper[4774]: I1003 15:04:59.400576 4774 scope.go:117] "RemoveContainer" containerID="15d9dab72c78ee548a6c63a060de73e7718761f07a33248fee0e9a88ffca49fb" Oct 03 15:04:59 crc kubenswrapper[4774]: I1003 15:04:59.404528 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-mtmsn"] Oct 03 15:04:59 crc kubenswrapper[4774]: I1003 15:04:59.412802 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-mtmsn"] Oct 03 15:04:59 crc kubenswrapper[4774]: I1003 15:04:59.420742 4774 scope.go:117] "RemoveContainer" containerID="301385af9b00fa1203df8341711ae7bdb12250a25edf099eb9b2b72091891c58" Oct 03 15:04:59 crc kubenswrapper[4774]: E1003 15:04:59.421158 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"301385af9b00fa1203df8341711ae7bdb12250a25edf099eb9b2b72091891c58\": container with ID starting with 301385af9b00fa1203df8341711ae7bdb12250a25edf099eb9b2b72091891c58 not found: ID does not exist" containerID="301385af9b00fa1203df8341711ae7bdb12250a25edf099eb9b2b72091891c58" Oct 03 15:04:59 crc kubenswrapper[4774]: I1003 15:04:59.421199 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"301385af9b00fa1203df8341711ae7bdb12250a25edf099eb9b2b72091891c58"} err="failed to get container status \"301385af9b00fa1203df8341711ae7bdb12250a25edf099eb9b2b72091891c58\": rpc error: code = NotFound desc = could not find container \"301385af9b00fa1203df8341711ae7bdb12250a25edf099eb9b2b72091891c58\": container with ID starting with 301385af9b00fa1203df8341711ae7bdb12250a25edf099eb9b2b72091891c58 not found: ID does not exist" Oct 03 
15:04:59 crc kubenswrapper[4774]: I1003 15:04:59.421231 4774 scope.go:117] "RemoveContainer" containerID="15d9dab72c78ee548a6c63a060de73e7718761f07a33248fee0e9a88ffca49fb" Oct 03 15:04:59 crc kubenswrapper[4774]: E1003 15:04:59.421566 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15d9dab72c78ee548a6c63a060de73e7718761f07a33248fee0e9a88ffca49fb\": container with ID starting with 15d9dab72c78ee548a6c63a060de73e7718761f07a33248fee0e9a88ffca49fb not found: ID does not exist" containerID="15d9dab72c78ee548a6c63a060de73e7718761f07a33248fee0e9a88ffca49fb" Oct 03 15:04:59 crc kubenswrapper[4774]: I1003 15:04:59.421604 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15d9dab72c78ee548a6c63a060de73e7718761f07a33248fee0e9a88ffca49fb"} err="failed to get container status \"15d9dab72c78ee548a6c63a060de73e7718761f07a33248fee0e9a88ffca49fb\": rpc error: code = NotFound desc = could not find container \"15d9dab72c78ee548a6c63a060de73e7718761f07a33248fee0e9a88ffca49fb\": container with ID starting with 15d9dab72c78ee548a6c63a060de73e7718761f07a33248fee0e9a88ffca49fb not found: ID does not exist" Oct 03 15:05:01 crc kubenswrapper[4774]: I1003 15:05:01.313110 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01cdd1f4-191a-4859-a27c-0b65d33ade12" path="/var/lib/kubelet/pods/01cdd1f4-191a-4859-a27c-0b65d33ade12/volumes" Oct 03 15:05:08 crc kubenswrapper[4774]: I1003 15:05:08.467465 4774 generic.go:334] "Generic (PLEG): container finished" podID="6d6cfa86-4356-4d79-9edd-977355592186" containerID="24ed5b3bb865fde0246fcf5b8f4b83daa696a05b41c04220d3fe3beb29b17384" exitCode=0 Oct 03 15:05:08 crc kubenswrapper[4774]: I1003 15:05:08.467534 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"6d6cfa86-4356-4d79-9edd-977355592186","Type":"ContainerDied","Data":"24ed5b3bb865fde0246fcf5b8f4b83daa696a05b41c04220d3fe3beb29b17384"} Oct 03 15:05:09 crc kubenswrapper[4774]: I1003 15:05:09.481572 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6d6cfa86-4356-4d79-9edd-977355592186","Type":"ContainerStarted","Data":"714f5c7a0e705f93cdb13dfbe4df96360d889a0133639922e89e940b7065ae29"} Oct 03 15:05:09 crc kubenswrapper[4774]: I1003 15:05:09.483303 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 03 15:05:09 crc kubenswrapper[4774]: I1003 15:05:09.508874 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.508817001 podStartE2EDuration="36.508817001s" podCreationTimestamp="2025-10-03 15:04:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:05:09.504225337 +0000 UTC m=+1332.093428799" watchObservedRunningTime="2025-10-03 15:05:09.508817001 +0000 UTC m=+1332.098020453" Oct 03 15:05:10 crc kubenswrapper[4774]: I1003 15:05:10.491444 4774 generic.go:334] "Generic (PLEG): container finished" podID="417bcf92-1c5e-4977-a197-62b603b795a2" containerID="eab10308b172fa5c011b87b2146814087da80355ec95d9ab4b9742375077b152" exitCode=0 Oct 03 15:05:10 crc kubenswrapper[4774]: I1003 15:05:10.491544 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"417bcf92-1c5e-4977-a197-62b603b795a2","Type":"ContainerDied","Data":"eab10308b172fa5c011b87b2146814087da80355ec95d9ab4b9742375077b152"} Oct 03 15:05:10 crc kubenswrapper[4774]: I1003 15:05:10.984004 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh"] Oct 03 15:05:10 crc kubenswrapper[4774]: E1003 15:05:10.984944 4774 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12ecba83-5e7f-4bec-a930-a02540cbde61" containerName="dnsmasq-dns" Oct 03 15:05:10 crc kubenswrapper[4774]: I1003 15:05:10.984967 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ecba83-5e7f-4bec-a930-a02540cbde61" containerName="dnsmasq-dns" Oct 03 15:05:10 crc kubenswrapper[4774]: E1003 15:05:10.984981 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01cdd1f4-191a-4859-a27c-0b65d33ade12" containerName="init" Oct 03 15:05:10 crc kubenswrapper[4774]: I1003 15:05:10.984989 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="01cdd1f4-191a-4859-a27c-0b65d33ade12" containerName="init" Oct 03 15:05:10 crc kubenswrapper[4774]: E1003 15:05:10.985010 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12ecba83-5e7f-4bec-a930-a02540cbde61" containerName="init" Oct 03 15:05:10 crc kubenswrapper[4774]: I1003 15:05:10.985017 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ecba83-5e7f-4bec-a930-a02540cbde61" containerName="init" Oct 03 15:05:10 crc kubenswrapper[4774]: E1003 15:05:10.985044 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01cdd1f4-191a-4859-a27c-0b65d33ade12" containerName="dnsmasq-dns" Oct 03 15:05:10 crc kubenswrapper[4774]: I1003 15:05:10.985049 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="01cdd1f4-191a-4859-a27c-0b65d33ade12" containerName="dnsmasq-dns" Oct 03 15:05:10 crc kubenswrapper[4774]: I1003 15:05:10.985220 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="12ecba83-5e7f-4bec-a930-a02540cbde61" containerName="dnsmasq-dns" Oct 03 15:05:10 crc kubenswrapper[4774]: I1003 15:05:10.985240 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="01cdd1f4-191a-4859-a27c-0b65d33ade12" containerName="dnsmasq-dns" Oct 03 15:05:10 crc kubenswrapper[4774]: I1003 15:05:10.985931 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh" Oct 03 15:05:10 crc kubenswrapper[4774]: I1003 15:05:10.989709 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 15:05:10 crc kubenswrapper[4774]: I1003 15:05:10.989800 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7bdzq" Oct 03 15:05:10 crc kubenswrapper[4774]: I1003 15:05:10.989714 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 15:05:10 crc kubenswrapper[4774]: I1003 15:05:10.991198 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 15:05:10 crc kubenswrapper[4774]: I1003 15:05:10.999322 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh"] Oct 03 15:05:11 crc kubenswrapper[4774]: I1003 15:05:11.101705 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/136d617b-f485-4841-b6e2-350b591cd22e-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh\" (UID: \"136d617b-f485-4841-b6e2-350b591cd22e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh" Oct 03 15:05:11 crc kubenswrapper[4774]: I1003 15:05:11.102068 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/136d617b-f485-4841-b6e2-350b591cd22e-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh\" (UID: \"136d617b-f485-4841-b6e2-350b591cd22e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh" Oct 03 15:05:11 crc kubenswrapper[4774]: I1003 15:05:11.102248 4774 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/136d617b-f485-4841-b6e2-350b591cd22e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh\" (UID: \"136d617b-f485-4841-b6e2-350b591cd22e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh" Oct 03 15:05:11 crc kubenswrapper[4774]: I1003 15:05:11.102334 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln42r\" (UniqueName: \"kubernetes.io/projected/136d617b-f485-4841-b6e2-350b591cd22e-kube-api-access-ln42r\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh\" (UID: \"136d617b-f485-4841-b6e2-350b591cd22e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh" Oct 03 15:05:11 crc kubenswrapper[4774]: I1003 15:05:11.204414 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/136d617b-f485-4841-b6e2-350b591cd22e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh\" (UID: \"136d617b-f485-4841-b6e2-350b591cd22e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh" Oct 03 15:05:11 crc kubenswrapper[4774]: I1003 15:05:11.204754 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln42r\" (UniqueName: \"kubernetes.io/projected/136d617b-f485-4841-b6e2-350b591cd22e-kube-api-access-ln42r\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh\" (UID: \"136d617b-f485-4841-b6e2-350b591cd22e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh" Oct 03 15:05:11 crc kubenswrapper[4774]: I1003 15:05:11.204979 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/136d617b-f485-4841-b6e2-350b591cd22e-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh\" (UID: \"136d617b-f485-4841-b6e2-350b591cd22e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh" Oct 03 15:05:11 crc kubenswrapper[4774]: I1003 15:05:11.205170 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/136d617b-f485-4841-b6e2-350b591cd22e-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh\" (UID: \"136d617b-f485-4841-b6e2-350b591cd22e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh" Oct 03 15:05:11 crc kubenswrapper[4774]: I1003 15:05:11.211297 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/136d617b-f485-4841-b6e2-350b591cd22e-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh\" (UID: \"136d617b-f485-4841-b6e2-350b591cd22e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh" Oct 03 15:05:11 crc kubenswrapper[4774]: I1003 15:05:11.212268 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/136d617b-f485-4841-b6e2-350b591cd22e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh\" (UID: \"136d617b-f485-4841-b6e2-350b591cd22e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh" Oct 03 15:05:11 crc kubenswrapper[4774]: I1003 15:05:11.222060 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/136d617b-f485-4841-b6e2-350b591cd22e-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh\" (UID: \"136d617b-f485-4841-b6e2-350b591cd22e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh" Oct 03 15:05:11 crc 
kubenswrapper[4774]: I1003 15:05:11.226773 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln42r\" (UniqueName: \"kubernetes.io/projected/136d617b-f485-4841-b6e2-350b591cd22e-kube-api-access-ln42r\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh\" (UID: \"136d617b-f485-4841-b6e2-350b591cd22e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh" Oct 03 15:05:11 crc kubenswrapper[4774]: I1003 15:05:11.301491 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh" Oct 03 15:05:11 crc kubenswrapper[4774]: I1003 15:05:11.511266 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"417bcf92-1c5e-4977-a197-62b603b795a2","Type":"ContainerStarted","Data":"e9a5545d9aae39d780c9d2e7fa0771fb7b74f792b1382f269b87005da136bdbe"} Oct 03 15:05:11 crc kubenswrapper[4774]: I1003 15:05:11.511968 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:05:11 crc kubenswrapper[4774]: I1003 15:05:11.543552 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.543527292 podStartE2EDuration="36.543527292s" podCreationTimestamp="2025-10-03 15:04:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:05:11.534767655 +0000 UTC m=+1334.123971117" watchObservedRunningTime="2025-10-03 15:05:11.543527292 +0000 UTC m=+1334.132730734" Oct 03 15:05:11 crc kubenswrapper[4774]: I1003 15:05:11.811763 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh"] Oct 03 15:05:11 crc kubenswrapper[4774]: I1003 15:05:11.833951 4774 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Oct 03 15:05:12 crc kubenswrapper[4774]: I1003 15:05:12.529620 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh" event={"ID":"136d617b-f485-4841-b6e2-350b591cd22e","Type":"ContainerStarted","Data":"d38b51ee166811c6507d75941cc05f30bc38e1e50113c21b938e6741ef1aa7d9"} Oct 03 15:05:20 crc kubenswrapper[4774]: I1003 15:05:20.654035 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:05:20 crc kubenswrapper[4774]: I1003 15:05:20.654802 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:05:21 crc kubenswrapper[4774]: I1003 15:05:21.630352 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh" event={"ID":"136d617b-f485-4841-b6e2-350b591cd22e","Type":"ContainerStarted","Data":"e90c0b1b5d62bdf39b6ac46a13aeb74476c30fcb4b44aae36e4b79ac3f331792"} Oct 03 15:05:21 crc kubenswrapper[4774]: I1003 15:05:21.655477 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh" podStartSLOduration=2.330744413 podStartE2EDuration="11.655458862s" podCreationTimestamp="2025-10-03 15:05:10 +0000 UTC" firstStartedPulling="2025-10-03 15:05:11.833594901 +0000 UTC m=+1334.422798353" lastFinishedPulling="2025-10-03 15:05:21.15830932 +0000 UTC m=+1343.747512802" observedRunningTime="2025-10-03 
15:05:21.652223722 +0000 UTC m=+1344.241427224" watchObservedRunningTime="2025-10-03 15:05:21.655458862 +0000 UTC m=+1344.244662314" Oct 03 15:05:23 crc kubenswrapper[4774]: I1003 15:05:23.491672 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 03 15:05:25 crc kubenswrapper[4774]: I1003 15:05:25.579507 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 03 15:05:32 crc kubenswrapper[4774]: I1003 15:05:32.763546 4774 generic.go:334] "Generic (PLEG): container finished" podID="136d617b-f485-4841-b6e2-350b591cd22e" containerID="e90c0b1b5d62bdf39b6ac46a13aeb74476c30fcb4b44aae36e4b79ac3f331792" exitCode=0 Oct 03 15:05:32 crc kubenswrapper[4774]: I1003 15:05:32.763657 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh" event={"ID":"136d617b-f485-4841-b6e2-350b591cd22e","Type":"ContainerDied","Data":"e90c0b1b5d62bdf39b6ac46a13aeb74476c30fcb4b44aae36e4b79ac3f331792"} Oct 03 15:05:34 crc kubenswrapper[4774]: I1003 15:05:34.215760 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh" Oct 03 15:05:34 crc kubenswrapper[4774]: I1003 15:05:34.293000 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/136d617b-f485-4841-b6e2-350b591cd22e-ssh-key\") pod \"136d617b-f485-4841-b6e2-350b591cd22e\" (UID: \"136d617b-f485-4841-b6e2-350b591cd22e\") " Oct 03 15:05:34 crc kubenswrapper[4774]: I1003 15:05:34.293105 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/136d617b-f485-4841-b6e2-350b591cd22e-inventory\") pod \"136d617b-f485-4841-b6e2-350b591cd22e\" (UID: \"136d617b-f485-4841-b6e2-350b591cd22e\") " Oct 03 15:05:34 crc kubenswrapper[4774]: I1003 15:05:34.293128 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/136d617b-f485-4841-b6e2-350b591cd22e-repo-setup-combined-ca-bundle\") pod \"136d617b-f485-4841-b6e2-350b591cd22e\" (UID: \"136d617b-f485-4841-b6e2-350b591cd22e\") " Oct 03 15:05:34 crc kubenswrapper[4774]: I1003 15:05:34.293235 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ln42r\" (UniqueName: \"kubernetes.io/projected/136d617b-f485-4841-b6e2-350b591cd22e-kube-api-access-ln42r\") pod \"136d617b-f485-4841-b6e2-350b591cd22e\" (UID: \"136d617b-f485-4841-b6e2-350b591cd22e\") " Oct 03 15:05:34 crc kubenswrapper[4774]: I1003 15:05:34.299338 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/136d617b-f485-4841-b6e2-350b591cd22e-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "136d617b-f485-4841-b6e2-350b591cd22e" (UID: "136d617b-f485-4841-b6e2-350b591cd22e"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:05:34 crc kubenswrapper[4774]: I1003 15:05:34.301923 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/136d617b-f485-4841-b6e2-350b591cd22e-kube-api-access-ln42r" (OuterVolumeSpecName: "kube-api-access-ln42r") pod "136d617b-f485-4841-b6e2-350b591cd22e" (UID: "136d617b-f485-4841-b6e2-350b591cd22e"). InnerVolumeSpecName "kube-api-access-ln42r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:05:34 crc kubenswrapper[4774]: I1003 15:05:34.322558 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/136d617b-f485-4841-b6e2-350b591cd22e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "136d617b-f485-4841-b6e2-350b591cd22e" (UID: "136d617b-f485-4841-b6e2-350b591cd22e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:05:34 crc kubenswrapper[4774]: I1003 15:05:34.328677 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/136d617b-f485-4841-b6e2-350b591cd22e-inventory" (OuterVolumeSpecName: "inventory") pod "136d617b-f485-4841-b6e2-350b591cd22e" (UID: "136d617b-f485-4841-b6e2-350b591cd22e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:05:34 crc kubenswrapper[4774]: I1003 15:05:34.395600 4774 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/136d617b-f485-4841-b6e2-350b591cd22e-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 15:05:34 crc kubenswrapper[4774]: I1003 15:05:34.395629 4774 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/136d617b-f485-4841-b6e2-350b591cd22e-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:05:34 crc kubenswrapper[4774]: I1003 15:05:34.395640 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ln42r\" (UniqueName: \"kubernetes.io/projected/136d617b-f485-4841-b6e2-350b591cd22e-kube-api-access-ln42r\") on node \"crc\" DevicePath \"\"" Oct 03 15:05:34 crc kubenswrapper[4774]: I1003 15:05:34.395649 4774 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/136d617b-f485-4841-b6e2-350b591cd22e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 15:05:34 crc kubenswrapper[4774]: I1003 15:05:34.796279 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh" event={"ID":"136d617b-f485-4841-b6e2-350b591cd22e","Type":"ContainerDied","Data":"d38b51ee166811c6507d75941cc05f30bc38e1e50113c21b938e6741ef1aa7d9"} Oct 03 15:05:34 crc kubenswrapper[4774]: I1003 15:05:34.796642 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d38b51ee166811c6507d75941cc05f30bc38e1e50113c21b938e6741ef1aa7d9" Oct 03 15:05:34 crc kubenswrapper[4774]: I1003 15:05:34.796441 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh" Oct 03 15:05:34 crc kubenswrapper[4774]: I1003 15:05:34.878708 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-vchpx"] Oct 03 15:05:34 crc kubenswrapper[4774]: E1003 15:05:34.879130 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="136d617b-f485-4841-b6e2-350b591cd22e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 03 15:05:34 crc kubenswrapper[4774]: I1003 15:05:34.879153 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="136d617b-f485-4841-b6e2-350b591cd22e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 03 15:05:34 crc kubenswrapper[4774]: I1003 15:05:34.879492 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="136d617b-f485-4841-b6e2-350b591cd22e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 03 15:05:34 crc kubenswrapper[4774]: I1003 15:05:34.880152 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vchpx" Oct 03 15:05:34 crc kubenswrapper[4774]: I1003 15:05:34.882198 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 15:05:34 crc kubenswrapper[4774]: I1003 15:05:34.882392 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 15:05:34 crc kubenswrapper[4774]: I1003 15:05:34.882640 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 15:05:34 crc kubenswrapper[4774]: I1003 15:05:34.883911 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7bdzq" Oct 03 15:05:34 crc kubenswrapper[4774]: I1003 15:05:34.896492 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-vchpx"] Oct 03 15:05:34 crc kubenswrapper[4774]: I1003 15:05:34.916835 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d2ae95f-a86b-4b58-a529-7b5d426bff79-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-vchpx\" (UID: \"4d2ae95f-a86b-4b58-a529-7b5d426bff79\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vchpx" Oct 03 15:05:34 crc kubenswrapper[4774]: I1003 15:05:34.917247 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk2g7\" (UniqueName: \"kubernetes.io/projected/4d2ae95f-a86b-4b58-a529-7b5d426bff79-kube-api-access-rk2g7\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-vchpx\" (UID: \"4d2ae95f-a86b-4b58-a529-7b5d426bff79\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vchpx" Oct 03 15:05:34 crc kubenswrapper[4774]: I1003 15:05:34.917300 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d2ae95f-a86b-4b58-a529-7b5d426bff79-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-vchpx\" (UID: \"4d2ae95f-a86b-4b58-a529-7b5d426bff79\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vchpx" Oct 03 15:05:35 crc kubenswrapper[4774]: I1003 15:05:35.018588 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk2g7\" (UniqueName: \"kubernetes.io/projected/4d2ae95f-a86b-4b58-a529-7b5d426bff79-kube-api-access-rk2g7\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-vchpx\" (UID: \"4d2ae95f-a86b-4b58-a529-7b5d426bff79\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vchpx" Oct 03 15:05:35 crc kubenswrapper[4774]: I1003 15:05:35.018637 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d2ae95f-a86b-4b58-a529-7b5d426bff79-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-vchpx\" (UID: \"4d2ae95f-a86b-4b58-a529-7b5d426bff79\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vchpx" Oct 03 15:05:35 crc kubenswrapper[4774]: I1003 15:05:35.018703 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d2ae95f-a86b-4b58-a529-7b5d426bff79-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-vchpx\" (UID: \"4d2ae95f-a86b-4b58-a529-7b5d426bff79\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vchpx" Oct 03 15:05:35 crc kubenswrapper[4774]: I1003 15:05:35.024323 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d2ae95f-a86b-4b58-a529-7b5d426bff79-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-vchpx\" (UID: \"4d2ae95f-a86b-4b58-a529-7b5d426bff79\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vchpx" Oct 03 15:05:35 crc kubenswrapper[4774]: I1003 15:05:35.024325 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d2ae95f-a86b-4b58-a529-7b5d426bff79-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-vchpx\" (UID: \"4d2ae95f-a86b-4b58-a529-7b5d426bff79\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vchpx" Oct 03 15:05:35 crc kubenswrapper[4774]: I1003 15:05:35.041955 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk2g7\" (UniqueName: \"kubernetes.io/projected/4d2ae95f-a86b-4b58-a529-7b5d426bff79-kube-api-access-rk2g7\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-vchpx\" (UID: \"4d2ae95f-a86b-4b58-a529-7b5d426bff79\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vchpx" Oct 03 15:05:35 crc kubenswrapper[4774]: I1003 15:05:35.201790 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vchpx" Oct 03 15:05:35 crc kubenswrapper[4774]: I1003 15:05:35.737135 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-vchpx"] Oct 03 15:05:35 crc kubenswrapper[4774]: I1003 15:05:35.808895 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vchpx" event={"ID":"4d2ae95f-a86b-4b58-a529-7b5d426bff79","Type":"ContainerStarted","Data":"bbf6ce4b8fcae3a470843621c6fc5be5bf64686938933ad353bdfd80eac4d3cb"} Oct 03 15:05:36 crc kubenswrapper[4774]: I1003 15:05:36.819462 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vchpx" event={"ID":"4d2ae95f-a86b-4b58-a529-7b5d426bff79","Type":"ContainerStarted","Data":"1ed8a4b7abadcfa0b3495d9c4d56ad2e43da57695c922c056dfdab401b2b0ce0"} Oct 03 15:05:36 crc kubenswrapper[4774]: I1003 15:05:36.848843 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vchpx" podStartSLOduration=2.401288209 podStartE2EDuration="2.848824081s" podCreationTimestamp="2025-10-03 15:05:34 +0000 UTC" firstStartedPulling="2025-10-03 15:05:35.744004116 +0000 UTC m=+1358.333207578" lastFinishedPulling="2025-10-03 15:05:36.191539988 +0000 UTC m=+1358.780743450" observedRunningTime="2025-10-03 15:05:36.84677562 +0000 UTC m=+1359.435979082" watchObservedRunningTime="2025-10-03 15:05:36.848824081 +0000 UTC m=+1359.438027533" Oct 03 15:05:39 crc kubenswrapper[4774]: I1003 15:05:39.847744 4774 generic.go:334] "Generic (PLEG): container finished" podID="4d2ae95f-a86b-4b58-a529-7b5d426bff79" containerID="1ed8a4b7abadcfa0b3495d9c4d56ad2e43da57695c922c056dfdab401b2b0ce0" exitCode=0 Oct 03 15:05:39 crc kubenswrapper[4774]: I1003 15:05:39.847842 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vchpx" event={"ID":"4d2ae95f-a86b-4b58-a529-7b5d426bff79","Type":"ContainerDied","Data":"1ed8a4b7abadcfa0b3495d9c4d56ad2e43da57695c922c056dfdab401b2b0ce0"} Oct 03 15:05:41 crc kubenswrapper[4774]: I1003 15:05:41.296478 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vchpx" Oct 03 15:05:41 crc kubenswrapper[4774]: I1003 15:05:41.352971 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk2g7\" (UniqueName: \"kubernetes.io/projected/4d2ae95f-a86b-4b58-a529-7b5d426bff79-kube-api-access-rk2g7\") pod \"4d2ae95f-a86b-4b58-a529-7b5d426bff79\" (UID: \"4d2ae95f-a86b-4b58-a529-7b5d426bff79\") " Oct 03 15:05:41 crc kubenswrapper[4774]: I1003 15:05:41.353180 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d2ae95f-a86b-4b58-a529-7b5d426bff79-ssh-key\") pod \"4d2ae95f-a86b-4b58-a529-7b5d426bff79\" (UID: \"4d2ae95f-a86b-4b58-a529-7b5d426bff79\") " Oct 03 15:05:41 crc kubenswrapper[4774]: I1003 15:05:41.353239 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d2ae95f-a86b-4b58-a529-7b5d426bff79-inventory\") pod \"4d2ae95f-a86b-4b58-a529-7b5d426bff79\" (UID: \"4d2ae95f-a86b-4b58-a529-7b5d426bff79\") " Oct 03 15:05:41 crc kubenswrapper[4774]: I1003 15:05:41.359393 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d2ae95f-a86b-4b58-a529-7b5d426bff79-kube-api-access-rk2g7" (OuterVolumeSpecName: "kube-api-access-rk2g7") pod "4d2ae95f-a86b-4b58-a529-7b5d426bff79" (UID: "4d2ae95f-a86b-4b58-a529-7b5d426bff79"). InnerVolumeSpecName "kube-api-access-rk2g7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:05:41 crc kubenswrapper[4774]: I1003 15:05:41.385697 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d2ae95f-a86b-4b58-a529-7b5d426bff79-inventory" (OuterVolumeSpecName: "inventory") pod "4d2ae95f-a86b-4b58-a529-7b5d426bff79" (UID: "4d2ae95f-a86b-4b58-a529-7b5d426bff79"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:05:41 crc kubenswrapper[4774]: I1003 15:05:41.393697 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d2ae95f-a86b-4b58-a529-7b5d426bff79-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4d2ae95f-a86b-4b58-a529-7b5d426bff79" (UID: "4d2ae95f-a86b-4b58-a529-7b5d426bff79"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:05:41 crc kubenswrapper[4774]: I1003 15:05:41.456084 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rk2g7\" (UniqueName: \"kubernetes.io/projected/4d2ae95f-a86b-4b58-a529-7b5d426bff79-kube-api-access-rk2g7\") on node \"crc\" DevicePath \"\"" Oct 03 15:05:41 crc kubenswrapper[4774]: I1003 15:05:41.456124 4774 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d2ae95f-a86b-4b58-a529-7b5d426bff79-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 15:05:41 crc kubenswrapper[4774]: I1003 15:05:41.456136 4774 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d2ae95f-a86b-4b58-a529-7b5d426bff79-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 15:05:41 crc kubenswrapper[4774]: I1003 15:05:41.870763 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vchpx" 
event={"ID":"4d2ae95f-a86b-4b58-a529-7b5d426bff79","Type":"ContainerDied","Data":"bbf6ce4b8fcae3a470843621c6fc5be5bf64686938933ad353bdfd80eac4d3cb"} Oct 03 15:05:41 crc kubenswrapper[4774]: I1003 15:05:41.870801 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbf6ce4b8fcae3a470843621c6fc5be5bf64686938933ad353bdfd80eac4d3cb" Oct 03 15:05:41 crc kubenswrapper[4774]: I1003 15:05:41.870832 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vchpx" Oct 03 15:05:41 crc kubenswrapper[4774]: I1003 15:05:41.974732 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb"] Oct 03 15:05:41 crc kubenswrapper[4774]: E1003 15:05:41.975248 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d2ae95f-a86b-4b58-a529-7b5d426bff79" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 03 15:05:41 crc kubenswrapper[4774]: I1003 15:05:41.975278 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d2ae95f-a86b-4b58-a529-7b5d426bff79" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 03 15:05:41 crc kubenswrapper[4774]: I1003 15:05:41.975553 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d2ae95f-a86b-4b58-a529-7b5d426bff79" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 03 15:05:41 crc kubenswrapper[4774]: I1003 15:05:41.976297 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb" Oct 03 15:05:41 crc kubenswrapper[4774]: I1003 15:05:41.978697 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 15:05:41 crc kubenswrapper[4774]: I1003 15:05:41.979455 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 15:05:41 crc kubenswrapper[4774]: I1003 15:05:41.980684 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7bdzq" Oct 03 15:05:41 crc kubenswrapper[4774]: I1003 15:05:41.987168 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 15:05:41 crc kubenswrapper[4774]: I1003 15:05:41.996143 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb"] Oct 03 15:05:42 crc kubenswrapper[4774]: I1003 15:05:42.065018 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d70dce54-aa22-4af1-a341-4ff90ba78722-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb\" (UID: \"d70dce54-aa22-4af1-a341-4ff90ba78722\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb" Oct 03 15:05:42 crc kubenswrapper[4774]: I1003 15:05:42.065149 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d70dce54-aa22-4af1-a341-4ff90ba78722-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb\" (UID: \"d70dce54-aa22-4af1-a341-4ff90ba78722\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb" Oct 03 15:05:42 crc kubenswrapper[4774]: I1003 15:05:42.065240 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5ftm\" (UniqueName: \"kubernetes.io/projected/d70dce54-aa22-4af1-a341-4ff90ba78722-kube-api-access-g5ftm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb\" (UID: \"d70dce54-aa22-4af1-a341-4ff90ba78722\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb" Oct 03 15:05:42 crc kubenswrapper[4774]: I1003 15:05:42.065317 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d70dce54-aa22-4af1-a341-4ff90ba78722-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb\" (UID: \"d70dce54-aa22-4af1-a341-4ff90ba78722\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb" Oct 03 15:05:42 crc kubenswrapper[4774]: I1003 15:05:42.167996 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d70dce54-aa22-4af1-a341-4ff90ba78722-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb\" (UID: \"d70dce54-aa22-4af1-a341-4ff90ba78722\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb" Oct 03 15:05:42 crc kubenswrapper[4774]: I1003 15:05:42.168094 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5ftm\" (UniqueName: \"kubernetes.io/projected/d70dce54-aa22-4af1-a341-4ff90ba78722-kube-api-access-g5ftm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb\" (UID: \"d70dce54-aa22-4af1-a341-4ff90ba78722\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb" Oct 03 15:05:42 crc kubenswrapper[4774]: I1003 15:05:42.168153 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d70dce54-aa22-4af1-a341-4ff90ba78722-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb\" (UID: \"d70dce54-aa22-4af1-a341-4ff90ba78722\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb" Oct 03 15:05:42 crc kubenswrapper[4774]: I1003 15:05:42.168245 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d70dce54-aa22-4af1-a341-4ff90ba78722-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb\" (UID: \"d70dce54-aa22-4af1-a341-4ff90ba78722\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb" Oct 03 15:05:42 crc kubenswrapper[4774]: I1003 15:05:42.174103 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d70dce54-aa22-4af1-a341-4ff90ba78722-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb\" (UID: \"d70dce54-aa22-4af1-a341-4ff90ba78722\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb" Oct 03 15:05:42 crc kubenswrapper[4774]: I1003 15:05:42.177921 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d70dce54-aa22-4af1-a341-4ff90ba78722-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb\" (UID: \"d70dce54-aa22-4af1-a341-4ff90ba78722\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb" Oct 03 15:05:42 crc kubenswrapper[4774]: I1003 15:05:42.178064 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d70dce54-aa22-4af1-a341-4ff90ba78722-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb\" (UID: \"d70dce54-aa22-4af1-a341-4ff90ba78722\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb" Oct 03 15:05:42 crc 
kubenswrapper[4774]: I1003 15:05:42.190910 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5ftm\" (UniqueName: \"kubernetes.io/projected/d70dce54-aa22-4af1-a341-4ff90ba78722-kube-api-access-g5ftm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb\" (UID: \"d70dce54-aa22-4af1-a341-4ff90ba78722\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb" Oct 03 15:05:42 crc kubenswrapper[4774]: I1003 15:05:42.295062 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb" Oct 03 15:05:42 crc kubenswrapper[4774]: I1003 15:05:42.813740 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb"] Oct 03 15:05:42 crc kubenswrapper[4774]: W1003 15:05:42.836173 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd70dce54_aa22_4af1_a341_4ff90ba78722.slice/crio-c7eae101dbe0a6da1835da0ce05097ab0a807df3cb4b133d297cad9690661119 WatchSource:0}: Error finding container c7eae101dbe0a6da1835da0ce05097ab0a807df3cb4b133d297cad9690661119: Status 404 returned error can't find the container with id c7eae101dbe0a6da1835da0ce05097ab0a807df3cb4b133d297cad9690661119 Oct 03 15:05:42 crc kubenswrapper[4774]: I1003 15:05:42.882314 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb" event={"ID":"d70dce54-aa22-4af1-a341-4ff90ba78722","Type":"ContainerStarted","Data":"c7eae101dbe0a6da1835da0ce05097ab0a807df3cb4b133d297cad9690661119"} Oct 03 15:05:43 crc kubenswrapper[4774]: I1003 15:05:43.894523 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb" 
event={"ID":"d70dce54-aa22-4af1-a341-4ff90ba78722","Type":"ContainerStarted","Data":"16b3dd861d084f153ba8b566db0be66bcc2530c636d4921cc8d373dcd2babe92"} Oct 03 15:05:43 crc kubenswrapper[4774]: I1003 15:05:43.918667 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb" podStartSLOduration=2.403605449 podStartE2EDuration="2.918640831s" podCreationTimestamp="2025-10-03 15:05:41 +0000 UTC" firstStartedPulling="2025-10-03 15:05:42.839264003 +0000 UTC m=+1365.428467465" lastFinishedPulling="2025-10-03 15:05:43.354299395 +0000 UTC m=+1365.943502847" observedRunningTime="2025-10-03 15:05:43.918138209 +0000 UTC m=+1366.507341671" watchObservedRunningTime="2025-10-03 15:05:43.918640831 +0000 UTC m=+1366.507844323" Oct 03 15:05:50 crc kubenswrapper[4774]: I1003 15:05:50.653823 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:05:50 crc kubenswrapper[4774]: I1003 15:05:50.654570 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:06:20 crc kubenswrapper[4774]: I1003 15:06:20.653956 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:06:20 crc kubenswrapper[4774]: I1003 15:06:20.654533 4774 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:06:20 crc kubenswrapper[4774]: I1003 15:06:20.654582 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" Oct 03 15:06:20 crc kubenswrapper[4774]: I1003 15:06:20.655246 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"44818931129eda9720850f3c5b49565e0ee25d9e624e68b358f2ade90a0039e5"} pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 15:06:20 crc kubenswrapper[4774]: I1003 15:06:20.655317 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" containerID="cri-o://44818931129eda9720850f3c5b49565e0ee25d9e624e68b358f2ade90a0039e5" gracePeriod=600 Oct 03 15:06:21 crc kubenswrapper[4774]: I1003 15:06:21.288307 4774 generic.go:334] "Generic (PLEG): container finished" podID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerID="44818931129eda9720850f3c5b49565e0ee25d9e624e68b358f2ade90a0039e5" exitCode=0 Oct 03 15:06:21 crc kubenswrapper[4774]: I1003 15:06:21.288416 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" event={"ID":"ca37ac4b-f421-4198-a179-12901d36f0f5","Type":"ContainerDied","Data":"44818931129eda9720850f3c5b49565e0ee25d9e624e68b358f2ade90a0039e5"} Oct 03 15:06:21 crc kubenswrapper[4774]: I1003 15:06:21.289035 4774 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" event={"ID":"ca37ac4b-f421-4198-a179-12901d36f0f5","Type":"ContainerStarted","Data":"74d4d4fd29e66b3f46bed47cb8482d6e253bf183444fa990f8669734114e0846"} Oct 03 15:06:21 crc kubenswrapper[4774]: I1003 15:06:21.289066 4774 scope.go:117] "RemoveContainer" containerID="3d631133d606feac7cf551b661ad83ff63af803d9385eff9dee1aa2b7ab7a1cd" Oct 03 15:06:23 crc kubenswrapper[4774]: I1003 15:06:23.948690 4774 scope.go:117] "RemoveContainer" containerID="cd3aaf95f01cd2b989fcb7c39289b44ea75ae137d4ce03872d6ecbab64985142" Oct 03 15:07:24 crc kubenswrapper[4774]: I1003 15:07:24.059346 4774 scope.go:117] "RemoveContainer" containerID="243f3d3655fd1960cd360dac4b6b275a2de3079b6eb4a7aa1a8e22aa14983188" Oct 03 15:07:24 crc kubenswrapper[4774]: I1003 15:07:24.103251 4774 scope.go:117] "RemoveContainer" containerID="93db5edd6b278dc571e695d72ddf2d8458116f67d7c550c2f036184cacbf1e4d" Oct 03 15:07:24 crc kubenswrapper[4774]: I1003 15:07:24.169594 4774 scope.go:117] "RemoveContainer" containerID="98b9a01eac0717c48a04feff10917773885c7865c780bf9362a382f4effe7ca8" Oct 03 15:07:24 crc kubenswrapper[4774]: I1003 15:07:24.212713 4774 scope.go:117] "RemoveContainer" containerID="ed4e3b3350eaeb9055c50f98a8537414d0624b18969aab216e1ab174f748523a" Oct 03 15:07:53 crc kubenswrapper[4774]: I1003 15:07:53.949178 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-77hmn"] Oct 03 15:07:53 crc kubenswrapper[4774]: I1003 15:07:53.951696 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-77hmn" Oct 03 15:07:53 crc kubenswrapper[4774]: I1003 15:07:53.959552 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-77hmn"] Oct 03 15:07:54 crc kubenswrapper[4774]: I1003 15:07:54.003541 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d7e75d8-aef6-48ba-8253-f107fe0c262b-utilities\") pod \"redhat-operators-77hmn\" (UID: \"0d7e75d8-aef6-48ba-8253-f107fe0c262b\") " pod="openshift-marketplace/redhat-operators-77hmn" Oct 03 15:07:54 crc kubenswrapper[4774]: I1003 15:07:54.003666 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d7e75d8-aef6-48ba-8253-f107fe0c262b-catalog-content\") pod \"redhat-operators-77hmn\" (UID: \"0d7e75d8-aef6-48ba-8253-f107fe0c262b\") " pod="openshift-marketplace/redhat-operators-77hmn" Oct 03 15:07:54 crc kubenswrapper[4774]: I1003 15:07:54.004063 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz2fz\" (UniqueName: \"kubernetes.io/projected/0d7e75d8-aef6-48ba-8253-f107fe0c262b-kube-api-access-nz2fz\") pod \"redhat-operators-77hmn\" (UID: \"0d7e75d8-aef6-48ba-8253-f107fe0c262b\") " pod="openshift-marketplace/redhat-operators-77hmn" Oct 03 15:07:54 crc kubenswrapper[4774]: I1003 15:07:54.106329 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz2fz\" (UniqueName: \"kubernetes.io/projected/0d7e75d8-aef6-48ba-8253-f107fe0c262b-kube-api-access-nz2fz\") pod \"redhat-operators-77hmn\" (UID: \"0d7e75d8-aef6-48ba-8253-f107fe0c262b\") " pod="openshift-marketplace/redhat-operators-77hmn" Oct 03 15:07:54 crc kubenswrapper[4774]: I1003 15:07:54.106443 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d7e75d8-aef6-48ba-8253-f107fe0c262b-utilities\") pod \"redhat-operators-77hmn\" (UID: \"0d7e75d8-aef6-48ba-8253-f107fe0c262b\") " pod="openshift-marketplace/redhat-operators-77hmn" Oct 03 15:07:54 crc kubenswrapper[4774]: I1003 15:07:54.106488 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d7e75d8-aef6-48ba-8253-f107fe0c262b-catalog-content\") pod \"redhat-operators-77hmn\" (UID: \"0d7e75d8-aef6-48ba-8253-f107fe0c262b\") " pod="openshift-marketplace/redhat-operators-77hmn" Oct 03 15:07:54 crc kubenswrapper[4774]: I1003 15:07:54.107016 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d7e75d8-aef6-48ba-8253-f107fe0c262b-catalog-content\") pod \"redhat-operators-77hmn\" (UID: \"0d7e75d8-aef6-48ba-8253-f107fe0c262b\") " pod="openshift-marketplace/redhat-operators-77hmn" Oct 03 15:07:54 crc kubenswrapper[4774]: I1003 15:07:54.107121 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d7e75d8-aef6-48ba-8253-f107fe0c262b-utilities\") pod \"redhat-operators-77hmn\" (UID: \"0d7e75d8-aef6-48ba-8253-f107fe0c262b\") " pod="openshift-marketplace/redhat-operators-77hmn" Oct 03 15:07:54 crc kubenswrapper[4774]: I1003 15:07:54.134470 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz2fz\" (UniqueName: \"kubernetes.io/projected/0d7e75d8-aef6-48ba-8253-f107fe0c262b-kube-api-access-nz2fz\") pod \"redhat-operators-77hmn\" (UID: \"0d7e75d8-aef6-48ba-8253-f107fe0c262b\") " pod="openshift-marketplace/redhat-operators-77hmn" Oct 03 15:07:54 crc kubenswrapper[4774]: I1003 15:07:54.280255 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-77hmn" Oct 03 15:07:54 crc kubenswrapper[4774]: I1003 15:07:54.753105 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-77hmn"] Oct 03 15:07:55 crc kubenswrapper[4774]: I1003 15:07:55.358827 4774 generic.go:334] "Generic (PLEG): container finished" podID="0d7e75d8-aef6-48ba-8253-f107fe0c262b" containerID="9a5d09b5800fb8ca82dd6ae69419260b543690d0519335d5dced5aadc2686b2e" exitCode=0 Oct 03 15:07:55 crc kubenswrapper[4774]: I1003 15:07:55.358868 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-77hmn" event={"ID":"0d7e75d8-aef6-48ba-8253-f107fe0c262b","Type":"ContainerDied","Data":"9a5d09b5800fb8ca82dd6ae69419260b543690d0519335d5dced5aadc2686b2e"} Oct 03 15:07:55 crc kubenswrapper[4774]: I1003 15:07:55.358891 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-77hmn" event={"ID":"0d7e75d8-aef6-48ba-8253-f107fe0c262b","Type":"ContainerStarted","Data":"5340660e6fefeb4ee8de4461edfd19501d1445cf37dd2dd419e1465e67b16290"} Oct 03 15:07:57 crc kubenswrapper[4774]: I1003 15:07:57.379141 4774 generic.go:334] "Generic (PLEG): container finished" podID="0d7e75d8-aef6-48ba-8253-f107fe0c262b" containerID="8821e13868cea7258c835c1d136c8946e631d45557744f73deaeecb87192d89d" exitCode=0 Oct 03 15:07:57 crc kubenswrapper[4774]: I1003 15:07:57.379275 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-77hmn" event={"ID":"0d7e75d8-aef6-48ba-8253-f107fe0c262b","Type":"ContainerDied","Data":"8821e13868cea7258c835c1d136c8946e631d45557744f73deaeecb87192d89d"} Oct 03 15:08:01 crc kubenswrapper[4774]: I1003 15:08:01.413921 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-77hmn" 
event={"ID":"0d7e75d8-aef6-48ba-8253-f107fe0c262b","Type":"ContainerStarted","Data":"7fc4ccaae6f43b0a5543f49e8b741c4730bd4afb6c01689e0365d34e11a9de43"} Oct 03 15:08:01 crc kubenswrapper[4774]: I1003 15:08:01.433072 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-77hmn" podStartSLOduration=2.935593544 podStartE2EDuration="8.433052162s" podCreationTimestamp="2025-10-03 15:07:53 +0000 UTC" firstStartedPulling="2025-10-03 15:07:55.360189673 +0000 UTC m=+1497.949393125" lastFinishedPulling="2025-10-03 15:08:00.857648271 +0000 UTC m=+1503.446851743" observedRunningTime="2025-10-03 15:08:01.430823296 +0000 UTC m=+1504.020026768" watchObservedRunningTime="2025-10-03 15:08:01.433052162 +0000 UTC m=+1504.022255614" Oct 03 15:08:04 crc kubenswrapper[4774]: I1003 15:08:04.281424 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-77hmn" Oct 03 15:08:04 crc kubenswrapper[4774]: I1003 15:08:04.281728 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-77hmn" Oct 03 15:08:05 crc kubenswrapper[4774]: I1003 15:08:05.351015 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-77hmn" podUID="0d7e75d8-aef6-48ba-8253-f107fe0c262b" containerName="registry-server" probeResult="failure" output=< Oct 03 15:08:05 crc kubenswrapper[4774]: timeout: failed to connect service ":50051" within 1s Oct 03 15:08:05 crc kubenswrapper[4774]: > Oct 03 15:08:12 crc kubenswrapper[4774]: I1003 15:08:12.811948 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q7l4f"] Oct 03 15:08:12 crc kubenswrapper[4774]: I1003 15:08:12.817087 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q7l4f" Oct 03 15:08:12 crc kubenswrapper[4774]: I1003 15:08:12.867236 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q7l4f"] Oct 03 15:08:12 crc kubenswrapper[4774]: I1003 15:08:12.936352 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf-utilities\") pod \"redhat-marketplace-q7l4f\" (UID: \"4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf\") " pod="openshift-marketplace/redhat-marketplace-q7l4f" Oct 03 15:08:12 crc kubenswrapper[4774]: I1003 15:08:12.936467 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf-catalog-content\") pod \"redhat-marketplace-q7l4f\" (UID: \"4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf\") " pod="openshift-marketplace/redhat-marketplace-q7l4f" Oct 03 15:08:12 crc kubenswrapper[4774]: I1003 15:08:12.936489 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx46v\" (UniqueName: \"kubernetes.io/projected/4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf-kube-api-access-rx46v\") pod \"redhat-marketplace-q7l4f\" (UID: \"4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf\") " pod="openshift-marketplace/redhat-marketplace-q7l4f" Oct 03 15:08:13 crc kubenswrapper[4774]: I1003 15:08:13.037564 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf-utilities\") pod \"redhat-marketplace-q7l4f\" (UID: \"4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf\") " pod="openshift-marketplace/redhat-marketplace-q7l4f" Oct 03 15:08:13 crc kubenswrapper[4774]: I1003 15:08:13.037640 4774 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf-catalog-content\") pod \"redhat-marketplace-q7l4f\" (UID: \"4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf\") " pod="openshift-marketplace/redhat-marketplace-q7l4f" Oct 03 15:08:13 crc kubenswrapper[4774]: I1003 15:08:13.037659 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx46v\" (UniqueName: \"kubernetes.io/projected/4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf-kube-api-access-rx46v\") pod \"redhat-marketplace-q7l4f\" (UID: \"4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf\") " pod="openshift-marketplace/redhat-marketplace-q7l4f" Oct 03 15:08:13 crc kubenswrapper[4774]: I1003 15:08:13.038629 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf-utilities\") pod \"redhat-marketplace-q7l4f\" (UID: \"4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf\") " pod="openshift-marketplace/redhat-marketplace-q7l4f" Oct 03 15:08:13 crc kubenswrapper[4774]: I1003 15:08:13.039035 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf-catalog-content\") pod \"redhat-marketplace-q7l4f\" (UID: \"4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf\") " pod="openshift-marketplace/redhat-marketplace-q7l4f" Oct 03 15:08:13 crc kubenswrapper[4774]: I1003 15:08:13.063189 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx46v\" (UniqueName: \"kubernetes.io/projected/4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf-kube-api-access-rx46v\") pod \"redhat-marketplace-q7l4f\" (UID: \"4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf\") " pod="openshift-marketplace/redhat-marketplace-q7l4f" Oct 03 15:08:13 crc kubenswrapper[4774]: I1003 15:08:13.163343 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q7l4f" Oct 03 15:08:13 crc kubenswrapper[4774]: W1003 15:08:13.661922 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bd8ef87_b6a6_4cf9_9c92_d60da2bea3bf.slice/crio-96666e7464d3f4b3b3168bb1d3ef662b783571be44c3a3fcd832b44bb322992d WatchSource:0}: Error finding container 96666e7464d3f4b3b3168bb1d3ef662b783571be44c3a3fcd832b44bb322992d: Status 404 returned error can't find the container with id 96666e7464d3f4b3b3168bb1d3ef662b783571be44c3a3fcd832b44bb322992d Oct 03 15:08:13 crc kubenswrapper[4774]: I1003 15:08:13.663528 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q7l4f"] Oct 03 15:08:14 crc kubenswrapper[4774]: I1003 15:08:14.325228 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-77hmn" Oct 03 15:08:14 crc kubenswrapper[4774]: I1003 15:08:14.387779 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-77hmn" Oct 03 15:08:14 crc kubenswrapper[4774]: I1003 15:08:14.534280 4774 generic.go:334] "Generic (PLEG): container finished" podID="4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf" containerID="9181351fbcbc138b64aac0c8050295991c249cd12ebbdd26ff2c217ba63f1574" exitCode=0 Oct 03 15:08:14 crc kubenswrapper[4774]: I1003 15:08:14.534343 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q7l4f" event={"ID":"4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf","Type":"ContainerDied","Data":"9181351fbcbc138b64aac0c8050295991c249cd12ebbdd26ff2c217ba63f1574"} Oct 03 15:08:14 crc kubenswrapper[4774]: I1003 15:08:14.534429 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q7l4f" 
event={"ID":"4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf","Type":"ContainerStarted","Data":"96666e7464d3f4b3b3168bb1d3ef662b783571be44c3a3fcd832b44bb322992d"} Oct 03 15:08:16 crc kubenswrapper[4774]: I1003 15:08:16.215953 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vwptl"] Oct 03 15:08:16 crc kubenswrapper[4774]: I1003 15:08:16.227361 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vwptl" Oct 03 15:08:16 crc kubenswrapper[4774]: I1003 15:08:16.262514 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vwptl"] Oct 03 15:08:16 crc kubenswrapper[4774]: I1003 15:08:16.403205 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cf2z\" (UniqueName: \"kubernetes.io/projected/1b2b2241-4922-45e7-bfce-72e4ce7076c7-kube-api-access-4cf2z\") pod \"certified-operators-vwptl\" (UID: \"1b2b2241-4922-45e7-bfce-72e4ce7076c7\") " pod="openshift-marketplace/certified-operators-vwptl" Oct 03 15:08:16 crc kubenswrapper[4774]: I1003 15:08:16.404356 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b2b2241-4922-45e7-bfce-72e4ce7076c7-catalog-content\") pod \"certified-operators-vwptl\" (UID: \"1b2b2241-4922-45e7-bfce-72e4ce7076c7\") " pod="openshift-marketplace/certified-operators-vwptl" Oct 03 15:08:16 crc kubenswrapper[4774]: I1003 15:08:16.404981 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b2b2241-4922-45e7-bfce-72e4ce7076c7-utilities\") pod \"certified-operators-vwptl\" (UID: \"1b2b2241-4922-45e7-bfce-72e4ce7076c7\") " pod="openshift-marketplace/certified-operators-vwptl" Oct 03 15:08:16 crc kubenswrapper[4774]: I1003 15:08:16.506533 
4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b2b2241-4922-45e7-bfce-72e4ce7076c7-catalog-content\") pod \"certified-operators-vwptl\" (UID: \"1b2b2241-4922-45e7-bfce-72e4ce7076c7\") " pod="openshift-marketplace/certified-operators-vwptl" Oct 03 15:08:16 crc kubenswrapper[4774]: I1003 15:08:16.506625 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b2b2241-4922-45e7-bfce-72e4ce7076c7-utilities\") pod \"certified-operators-vwptl\" (UID: \"1b2b2241-4922-45e7-bfce-72e4ce7076c7\") " pod="openshift-marketplace/certified-operators-vwptl" Oct 03 15:08:16 crc kubenswrapper[4774]: I1003 15:08:16.506672 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cf2z\" (UniqueName: \"kubernetes.io/projected/1b2b2241-4922-45e7-bfce-72e4ce7076c7-kube-api-access-4cf2z\") pod \"certified-operators-vwptl\" (UID: \"1b2b2241-4922-45e7-bfce-72e4ce7076c7\") " pod="openshift-marketplace/certified-operators-vwptl" Oct 03 15:08:16 crc kubenswrapper[4774]: I1003 15:08:16.507109 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b2b2241-4922-45e7-bfce-72e4ce7076c7-catalog-content\") pod \"certified-operators-vwptl\" (UID: \"1b2b2241-4922-45e7-bfce-72e4ce7076c7\") " pod="openshift-marketplace/certified-operators-vwptl" Oct 03 15:08:16 crc kubenswrapper[4774]: I1003 15:08:16.507411 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b2b2241-4922-45e7-bfce-72e4ce7076c7-utilities\") pod \"certified-operators-vwptl\" (UID: \"1b2b2241-4922-45e7-bfce-72e4ce7076c7\") " pod="openshift-marketplace/certified-operators-vwptl" Oct 03 15:08:16 crc kubenswrapper[4774]: I1003 15:08:16.528404 4774 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4cf2z\" (UniqueName: \"kubernetes.io/projected/1b2b2241-4922-45e7-bfce-72e4ce7076c7-kube-api-access-4cf2z\") pod \"certified-operators-vwptl\" (UID: \"1b2b2241-4922-45e7-bfce-72e4ce7076c7\") " pod="openshift-marketplace/certified-operators-vwptl" Oct 03 15:08:16 crc kubenswrapper[4774]: I1003 15:08:16.563307 4774 generic.go:334] "Generic (PLEG): container finished" podID="4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf" containerID="fb75149ce2a8ebf389a3121b8b0f1e929ba535770d73341bbb1a5c8419f87ca0" exitCode=0 Oct 03 15:08:16 crc kubenswrapper[4774]: I1003 15:08:16.563359 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q7l4f" event={"ID":"4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf","Type":"ContainerDied","Data":"fb75149ce2a8ebf389a3121b8b0f1e929ba535770d73341bbb1a5c8419f87ca0"} Oct 03 15:08:16 crc kubenswrapper[4774]: I1003 15:08:16.566344 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vwptl" Oct 03 15:08:17 crc kubenswrapper[4774]: I1003 15:08:17.094346 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vwptl"] Oct 03 15:08:17 crc kubenswrapper[4774]: W1003 15:08:17.103040 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b2b2241_4922_45e7_bfce_72e4ce7076c7.slice/crio-07c52ccacc78ec27b40e81485f322210d06d5b9704d12bb5bd0404e30f2cdc51 WatchSource:0}: Error finding container 07c52ccacc78ec27b40e81485f322210d06d5b9704d12bb5bd0404e30f2cdc51: Status 404 returned error can't find the container with id 07c52ccacc78ec27b40e81485f322210d06d5b9704d12bb5bd0404e30f2cdc51 Oct 03 15:08:17 crc kubenswrapper[4774]: I1003 15:08:17.575263 4774 generic.go:334] "Generic (PLEG): container finished" podID="1b2b2241-4922-45e7-bfce-72e4ce7076c7" 
containerID="03c2f2870ca3db252b3ba9c3153bd35995cb801cbee2f93d053ce2bc5e55cf8f" exitCode=0 Oct 03 15:08:17 crc kubenswrapper[4774]: I1003 15:08:17.575309 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwptl" event={"ID":"1b2b2241-4922-45e7-bfce-72e4ce7076c7","Type":"ContainerDied","Data":"03c2f2870ca3db252b3ba9c3153bd35995cb801cbee2f93d053ce2bc5e55cf8f"} Oct 03 15:08:17 crc kubenswrapper[4774]: I1003 15:08:17.575619 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwptl" event={"ID":"1b2b2241-4922-45e7-bfce-72e4ce7076c7","Type":"ContainerStarted","Data":"07c52ccacc78ec27b40e81485f322210d06d5b9704d12bb5bd0404e30f2cdc51"} Oct 03 15:08:17 crc kubenswrapper[4774]: I1003 15:08:17.578654 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q7l4f" event={"ID":"4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf","Type":"ContainerStarted","Data":"2a4bfc5e85754eead8634748a601025c5b263c7975b481e0ed94d7198740d67e"} Oct 03 15:08:17 crc kubenswrapper[4774]: I1003 15:08:17.646671 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q7l4f" podStartSLOduration=3.183010153 podStartE2EDuration="5.646637821s" podCreationTimestamp="2025-10-03 15:08:12 +0000 UTC" firstStartedPulling="2025-10-03 15:08:14.536765783 +0000 UTC m=+1517.125969245" lastFinishedPulling="2025-10-03 15:08:17.000393421 +0000 UTC m=+1519.589596913" observedRunningTime="2025-10-03 15:08:17.643846522 +0000 UTC m=+1520.233049974" watchObservedRunningTime="2025-10-03 15:08:17.646637821 +0000 UTC m=+1520.235841273" Oct 03 15:08:18 crc kubenswrapper[4774]: I1003 15:08:18.386807 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-77hmn"] Oct 03 15:08:18 crc kubenswrapper[4774]: I1003 15:08:18.387436 4774 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-77hmn" podUID="0d7e75d8-aef6-48ba-8253-f107fe0c262b" containerName="registry-server" containerID="cri-o://7fc4ccaae6f43b0a5543f49e8b741c4730bd4afb6c01689e0365d34e11a9de43" gracePeriod=2 Oct 03 15:08:18 crc kubenswrapper[4774]: I1003 15:08:18.590179 4774 generic.go:334] "Generic (PLEG): container finished" podID="0d7e75d8-aef6-48ba-8253-f107fe0c262b" containerID="7fc4ccaae6f43b0a5543f49e8b741c4730bd4afb6c01689e0365d34e11a9de43" exitCode=0 Oct 03 15:08:18 crc kubenswrapper[4774]: I1003 15:08:18.590236 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-77hmn" event={"ID":"0d7e75d8-aef6-48ba-8253-f107fe0c262b","Type":"ContainerDied","Data":"7fc4ccaae6f43b0a5543f49e8b741c4730bd4afb6c01689e0365d34e11a9de43"} Oct 03 15:08:18 crc kubenswrapper[4774]: I1003 15:08:18.820730 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-77hmn" Oct 03 15:08:18 crc kubenswrapper[4774]: I1003 15:08:18.957274 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d7e75d8-aef6-48ba-8253-f107fe0c262b-catalog-content\") pod \"0d7e75d8-aef6-48ba-8253-f107fe0c262b\" (UID: \"0d7e75d8-aef6-48ba-8253-f107fe0c262b\") " Oct 03 15:08:18 crc kubenswrapper[4774]: I1003 15:08:18.957445 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d7e75d8-aef6-48ba-8253-f107fe0c262b-utilities\") pod \"0d7e75d8-aef6-48ba-8253-f107fe0c262b\" (UID: \"0d7e75d8-aef6-48ba-8253-f107fe0c262b\") " Oct 03 15:08:18 crc kubenswrapper[4774]: I1003 15:08:18.957576 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nz2fz\" (UniqueName: \"kubernetes.io/projected/0d7e75d8-aef6-48ba-8253-f107fe0c262b-kube-api-access-nz2fz\") pod 
\"0d7e75d8-aef6-48ba-8253-f107fe0c262b\" (UID: \"0d7e75d8-aef6-48ba-8253-f107fe0c262b\") " Oct 03 15:08:18 crc kubenswrapper[4774]: I1003 15:08:18.958258 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d7e75d8-aef6-48ba-8253-f107fe0c262b-utilities" (OuterVolumeSpecName: "utilities") pod "0d7e75d8-aef6-48ba-8253-f107fe0c262b" (UID: "0d7e75d8-aef6-48ba-8253-f107fe0c262b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:08:18 crc kubenswrapper[4774]: I1003 15:08:18.965171 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d7e75d8-aef6-48ba-8253-f107fe0c262b-kube-api-access-nz2fz" (OuterVolumeSpecName: "kube-api-access-nz2fz") pod "0d7e75d8-aef6-48ba-8253-f107fe0c262b" (UID: "0d7e75d8-aef6-48ba-8253-f107fe0c262b"). InnerVolumeSpecName "kube-api-access-nz2fz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:08:19 crc kubenswrapper[4774]: I1003 15:08:19.040423 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d7e75d8-aef6-48ba-8253-f107fe0c262b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0d7e75d8-aef6-48ba-8253-f107fe0c262b" (UID: "0d7e75d8-aef6-48ba-8253-f107fe0c262b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:08:19 crc kubenswrapper[4774]: I1003 15:08:19.060168 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d7e75d8-aef6-48ba-8253-f107fe0c262b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 15:08:19 crc kubenswrapper[4774]: I1003 15:08:19.060205 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d7e75d8-aef6-48ba-8253-f107fe0c262b-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 15:08:19 crc kubenswrapper[4774]: I1003 15:08:19.060216 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nz2fz\" (UniqueName: \"kubernetes.io/projected/0d7e75d8-aef6-48ba-8253-f107fe0c262b-kube-api-access-nz2fz\") on node \"crc\" DevicePath \"\"" Oct 03 15:08:19 crc kubenswrapper[4774]: I1003 15:08:19.599890 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-77hmn" Oct 03 15:08:19 crc kubenswrapper[4774]: I1003 15:08:19.599895 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-77hmn" event={"ID":"0d7e75d8-aef6-48ba-8253-f107fe0c262b","Type":"ContainerDied","Data":"5340660e6fefeb4ee8de4461edfd19501d1445cf37dd2dd419e1465e67b16290"} Oct 03 15:08:19 crc kubenswrapper[4774]: I1003 15:08:19.600000 4774 scope.go:117] "RemoveContainer" containerID="7fc4ccaae6f43b0a5543f49e8b741c4730bd4afb6c01689e0365d34e11a9de43" Oct 03 15:08:19 crc kubenswrapper[4774]: I1003 15:08:19.601643 4774 generic.go:334] "Generic (PLEG): container finished" podID="1b2b2241-4922-45e7-bfce-72e4ce7076c7" containerID="7353df0ee47bcfa7ef79e324d06b4887dc7255fa7587103be2c1f26dc4303ac9" exitCode=0 Oct 03 15:08:19 crc kubenswrapper[4774]: I1003 15:08:19.601691 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwptl" 
event={"ID":"1b2b2241-4922-45e7-bfce-72e4ce7076c7","Type":"ContainerDied","Data":"7353df0ee47bcfa7ef79e324d06b4887dc7255fa7587103be2c1f26dc4303ac9"} Oct 03 15:08:19 crc kubenswrapper[4774]: I1003 15:08:19.637913 4774 scope.go:117] "RemoveContainer" containerID="8821e13868cea7258c835c1d136c8946e631d45557744f73deaeecb87192d89d" Oct 03 15:08:19 crc kubenswrapper[4774]: I1003 15:08:19.651302 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-77hmn"] Oct 03 15:08:19 crc kubenswrapper[4774]: I1003 15:08:19.669974 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-77hmn"] Oct 03 15:08:19 crc kubenswrapper[4774]: I1003 15:08:19.674528 4774 scope.go:117] "RemoveContainer" containerID="9a5d09b5800fb8ca82dd6ae69419260b543690d0519335d5dced5aadc2686b2e" Oct 03 15:08:20 crc kubenswrapper[4774]: I1003 15:08:20.615244 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwptl" event={"ID":"1b2b2241-4922-45e7-bfce-72e4ce7076c7","Type":"ContainerStarted","Data":"396b8ed2dde420d3b11be2691f5adbad1c3fc8126e9e02ef810f6dd1d834f4fd"} Oct 03 15:08:20 crc kubenswrapper[4774]: I1003 15:08:20.634630 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vwptl" podStartSLOduration=2.113067055 podStartE2EDuration="4.634611412s" podCreationTimestamp="2025-10-03 15:08:16 +0000 UTC" firstStartedPulling="2025-10-03 15:08:17.578585861 +0000 UTC m=+1520.167789343" lastFinishedPulling="2025-10-03 15:08:20.100130208 +0000 UTC m=+1522.689333700" observedRunningTime="2025-10-03 15:08:20.630740736 +0000 UTC m=+1523.219944208" watchObservedRunningTime="2025-10-03 15:08:20.634611412 +0000 UTC m=+1523.223814874" Oct 03 15:08:20 crc kubenswrapper[4774]: I1003 15:08:20.653757 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:08:20 crc kubenswrapper[4774]: I1003 15:08:20.653830 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:08:20 crc kubenswrapper[4774]: I1003 15:08:20.802107 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-25qgn"] Oct 03 15:08:20 crc kubenswrapper[4774]: E1003 15:08:20.802600 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d7e75d8-aef6-48ba-8253-f107fe0c262b" containerName="extract-content" Oct 03 15:08:20 crc kubenswrapper[4774]: I1003 15:08:20.802632 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d7e75d8-aef6-48ba-8253-f107fe0c262b" containerName="extract-content" Oct 03 15:08:20 crc kubenswrapper[4774]: E1003 15:08:20.802666 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d7e75d8-aef6-48ba-8253-f107fe0c262b" containerName="extract-utilities" Oct 03 15:08:20 crc kubenswrapper[4774]: I1003 15:08:20.802680 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d7e75d8-aef6-48ba-8253-f107fe0c262b" containerName="extract-utilities" Oct 03 15:08:20 crc kubenswrapper[4774]: E1003 15:08:20.802711 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d7e75d8-aef6-48ba-8253-f107fe0c262b" containerName="registry-server" Oct 03 15:08:20 crc kubenswrapper[4774]: I1003 15:08:20.802722 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d7e75d8-aef6-48ba-8253-f107fe0c262b" containerName="registry-server" Oct 03 15:08:20 crc kubenswrapper[4774]: I1003 
15:08:20.803004 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d7e75d8-aef6-48ba-8253-f107fe0c262b" containerName="registry-server" Oct 03 15:08:20 crc kubenswrapper[4774]: I1003 15:08:20.804992 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-25qgn" Oct 03 15:08:20 crc kubenswrapper[4774]: I1003 15:08:20.813450 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-25qgn"] Oct 03 15:08:21 crc kubenswrapper[4774]: I1003 15:08:21.000793 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b9923a4-3d25-4762-9f54-c1afebdfb78a-catalog-content\") pod \"community-operators-25qgn\" (UID: \"2b9923a4-3d25-4762-9f54-c1afebdfb78a\") " pod="openshift-marketplace/community-operators-25qgn" Oct 03 15:08:21 crc kubenswrapper[4774]: I1003 15:08:21.001662 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b9923a4-3d25-4762-9f54-c1afebdfb78a-utilities\") pod \"community-operators-25qgn\" (UID: \"2b9923a4-3d25-4762-9f54-c1afebdfb78a\") " pod="openshift-marketplace/community-operators-25qgn" Oct 03 15:08:21 crc kubenswrapper[4774]: I1003 15:08:21.001770 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46dhw\" (UniqueName: \"kubernetes.io/projected/2b9923a4-3d25-4762-9f54-c1afebdfb78a-kube-api-access-46dhw\") pod \"community-operators-25qgn\" (UID: \"2b9923a4-3d25-4762-9f54-c1afebdfb78a\") " pod="openshift-marketplace/community-operators-25qgn" Oct 03 15:08:21 crc kubenswrapper[4774]: I1003 15:08:21.104076 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2b9923a4-3d25-4762-9f54-c1afebdfb78a-catalog-content\") pod \"community-operators-25qgn\" (UID: \"2b9923a4-3d25-4762-9f54-c1afebdfb78a\") " pod="openshift-marketplace/community-operators-25qgn" Oct 03 15:08:21 crc kubenswrapper[4774]: I1003 15:08:21.104188 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b9923a4-3d25-4762-9f54-c1afebdfb78a-utilities\") pod \"community-operators-25qgn\" (UID: \"2b9923a4-3d25-4762-9f54-c1afebdfb78a\") " pod="openshift-marketplace/community-operators-25qgn" Oct 03 15:08:21 crc kubenswrapper[4774]: I1003 15:08:21.104229 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46dhw\" (UniqueName: \"kubernetes.io/projected/2b9923a4-3d25-4762-9f54-c1afebdfb78a-kube-api-access-46dhw\") pod \"community-operators-25qgn\" (UID: \"2b9923a4-3d25-4762-9f54-c1afebdfb78a\") " pod="openshift-marketplace/community-operators-25qgn" Oct 03 15:08:21 crc kubenswrapper[4774]: I1003 15:08:21.104846 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b9923a4-3d25-4762-9f54-c1afebdfb78a-catalog-content\") pod \"community-operators-25qgn\" (UID: \"2b9923a4-3d25-4762-9f54-c1afebdfb78a\") " pod="openshift-marketplace/community-operators-25qgn" Oct 03 15:08:21 crc kubenswrapper[4774]: I1003 15:08:21.104964 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b9923a4-3d25-4762-9f54-c1afebdfb78a-utilities\") pod \"community-operators-25qgn\" (UID: \"2b9923a4-3d25-4762-9f54-c1afebdfb78a\") " pod="openshift-marketplace/community-operators-25qgn" Oct 03 15:08:21 crc kubenswrapper[4774]: I1003 15:08:21.127564 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46dhw\" (UniqueName: 
\"kubernetes.io/projected/2b9923a4-3d25-4762-9f54-c1afebdfb78a-kube-api-access-46dhw\") pod \"community-operators-25qgn\" (UID: \"2b9923a4-3d25-4762-9f54-c1afebdfb78a\") " pod="openshift-marketplace/community-operators-25qgn" Oct 03 15:08:21 crc kubenswrapper[4774]: I1003 15:08:21.135564 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-25qgn" Oct 03 15:08:21 crc kubenswrapper[4774]: I1003 15:08:21.314698 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d7e75d8-aef6-48ba-8253-f107fe0c262b" path="/var/lib/kubelet/pods/0d7e75d8-aef6-48ba-8253-f107fe0c262b/volumes" Oct 03 15:08:21 crc kubenswrapper[4774]: I1003 15:08:21.646976 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-25qgn"] Oct 03 15:08:21 crc kubenswrapper[4774]: W1003 15:08:21.650943 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b9923a4_3d25_4762_9f54_c1afebdfb78a.slice/crio-8a35976a2a4dd5af215c8e2b0df6fac9613928aa10aff183db7bd17bcc1f417d WatchSource:0}: Error finding container 8a35976a2a4dd5af215c8e2b0df6fac9613928aa10aff183db7bd17bcc1f417d: Status 404 returned error can't find the container with id 8a35976a2a4dd5af215c8e2b0df6fac9613928aa10aff183db7bd17bcc1f417d Oct 03 15:08:22 crc kubenswrapper[4774]: I1003 15:08:22.635721 4774 generic.go:334] "Generic (PLEG): container finished" podID="2b9923a4-3d25-4762-9f54-c1afebdfb78a" containerID="ec2e4d9875b165a4cae43bcb1c003483b105e68e6acaa998958a847b08209564" exitCode=0 Oct 03 15:08:22 crc kubenswrapper[4774]: I1003 15:08:22.635805 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25qgn" event={"ID":"2b9923a4-3d25-4762-9f54-c1afebdfb78a","Type":"ContainerDied","Data":"ec2e4d9875b165a4cae43bcb1c003483b105e68e6acaa998958a847b08209564"} Oct 03 15:08:22 crc kubenswrapper[4774]: 
I1003 15:08:22.636081 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25qgn" event={"ID":"2b9923a4-3d25-4762-9f54-c1afebdfb78a","Type":"ContainerStarted","Data":"8a35976a2a4dd5af215c8e2b0df6fac9613928aa10aff183db7bd17bcc1f417d"} Oct 03 15:08:23 crc kubenswrapper[4774]: I1003 15:08:23.163830 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q7l4f" Oct 03 15:08:23 crc kubenswrapper[4774]: I1003 15:08:23.164468 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q7l4f" Oct 03 15:08:23 crc kubenswrapper[4774]: I1003 15:08:23.232513 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q7l4f" Oct 03 15:08:23 crc kubenswrapper[4774]: I1003 15:08:23.731694 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q7l4f" Oct 03 15:08:24 crc kubenswrapper[4774]: I1003 15:08:24.670730 4774 generic.go:334] "Generic (PLEG): container finished" podID="2b9923a4-3d25-4762-9f54-c1afebdfb78a" containerID="f7a76c3634497f1d0db7e1f9dfd4a8de16d354c96c294465a77f39f21f7ef9cc" exitCode=0 Oct 03 15:08:24 crc kubenswrapper[4774]: I1003 15:08:24.670819 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25qgn" event={"ID":"2b9923a4-3d25-4762-9f54-c1afebdfb78a","Type":"ContainerDied","Data":"f7a76c3634497f1d0db7e1f9dfd4a8de16d354c96c294465a77f39f21f7ef9cc"} Oct 03 15:08:25 crc kubenswrapper[4774]: I1003 15:08:25.684643 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25qgn" event={"ID":"2b9923a4-3d25-4762-9f54-c1afebdfb78a","Type":"ContainerStarted","Data":"1db96eb01e1b2e10d6635d1c82b36f4f233b848183929369eb389478eca1247a"} Oct 03 15:08:25 crc kubenswrapper[4774]: I1003 
15:08:25.710713 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-25qgn" podStartSLOduration=3.002811451 podStartE2EDuration="5.710691695s" podCreationTimestamp="2025-10-03 15:08:20 +0000 UTC" firstStartedPulling="2025-10-03 15:08:22.636906683 +0000 UTC m=+1525.226110135" lastFinishedPulling="2025-10-03 15:08:25.344786917 +0000 UTC m=+1527.933990379" observedRunningTime="2025-10-03 15:08:25.707502086 +0000 UTC m=+1528.296705558" watchObservedRunningTime="2025-10-03 15:08:25.710691695 +0000 UTC m=+1528.299895157" Oct 03 15:08:26 crc kubenswrapper[4774]: I1003 15:08:26.566838 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vwptl" Oct 03 15:08:26 crc kubenswrapper[4774]: I1003 15:08:26.567160 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vwptl" Oct 03 15:08:26 crc kubenswrapper[4774]: I1003 15:08:26.592191 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q7l4f"] Oct 03 15:08:26 crc kubenswrapper[4774]: I1003 15:08:26.611944 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vwptl" Oct 03 15:08:26 crc kubenswrapper[4774]: I1003 15:08:26.693281 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-q7l4f" podUID="4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf" containerName="registry-server" containerID="cri-o://2a4bfc5e85754eead8634748a601025c5b263c7975b481e0ed94d7198740d67e" gracePeriod=2 Oct 03 15:08:26 crc kubenswrapper[4774]: I1003 15:08:26.743128 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vwptl" Oct 03 15:08:27 crc kubenswrapper[4774]: I1003 15:08:27.140004 4774 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q7l4f" Oct 03 15:08:27 crc kubenswrapper[4774]: I1003 15:08:27.247073 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf-utilities\") pod \"4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf\" (UID: \"4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf\") " Oct 03 15:08:27 crc kubenswrapper[4774]: I1003 15:08:27.247166 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf-catalog-content\") pod \"4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf\" (UID: \"4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf\") " Oct 03 15:08:27 crc kubenswrapper[4774]: I1003 15:08:27.247302 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx46v\" (UniqueName: \"kubernetes.io/projected/4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf-kube-api-access-rx46v\") pod \"4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf\" (UID: \"4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf\") " Oct 03 15:08:27 crc kubenswrapper[4774]: I1003 15:08:27.248202 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf-utilities" (OuterVolumeSpecName: "utilities") pod "4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf" (UID: "4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:08:27 crc kubenswrapper[4774]: I1003 15:08:27.252733 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf-kube-api-access-rx46v" (OuterVolumeSpecName: "kube-api-access-rx46v") pod "4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf" (UID: "4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf"). InnerVolumeSpecName "kube-api-access-rx46v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:08:27 crc kubenswrapper[4774]: I1003 15:08:27.264333 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf" (UID: "4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:08:27 crc kubenswrapper[4774]: I1003 15:08:27.349847 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 15:08:27 crc kubenswrapper[4774]: I1003 15:08:27.349875 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 15:08:27 crc kubenswrapper[4774]: I1003 15:08:27.349908 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx46v\" (UniqueName: \"kubernetes.io/projected/4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf-kube-api-access-rx46v\") on node \"crc\" DevicePath \"\"" Oct 03 15:08:27 crc kubenswrapper[4774]: I1003 15:08:27.705608 4774 generic.go:334] "Generic (PLEG): container finished" podID="4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf" containerID="2a4bfc5e85754eead8634748a601025c5b263c7975b481e0ed94d7198740d67e" exitCode=0 Oct 03 15:08:27 crc kubenswrapper[4774]: I1003 15:08:27.705670 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q7l4f" Oct 03 15:08:27 crc kubenswrapper[4774]: I1003 15:08:27.705733 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q7l4f" event={"ID":"4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf","Type":"ContainerDied","Data":"2a4bfc5e85754eead8634748a601025c5b263c7975b481e0ed94d7198740d67e"} Oct 03 15:08:27 crc kubenswrapper[4774]: I1003 15:08:27.705812 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q7l4f" event={"ID":"4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf","Type":"ContainerDied","Data":"96666e7464d3f4b3b3168bb1d3ef662b783571be44c3a3fcd832b44bb322992d"} Oct 03 15:08:27 crc kubenswrapper[4774]: I1003 15:08:27.705846 4774 scope.go:117] "RemoveContainer" containerID="2a4bfc5e85754eead8634748a601025c5b263c7975b481e0ed94d7198740d67e" Oct 03 15:08:27 crc kubenswrapper[4774]: I1003 15:08:27.739668 4774 scope.go:117] "RemoveContainer" containerID="fb75149ce2a8ebf389a3121b8b0f1e929ba535770d73341bbb1a5c8419f87ca0" Oct 03 15:08:27 crc kubenswrapper[4774]: I1003 15:08:27.744811 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q7l4f"] Oct 03 15:08:27 crc kubenswrapper[4774]: I1003 15:08:27.753471 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-q7l4f"] Oct 03 15:08:27 crc kubenswrapper[4774]: I1003 15:08:27.771885 4774 scope.go:117] "RemoveContainer" containerID="9181351fbcbc138b64aac0c8050295991c249cd12ebbdd26ff2c217ba63f1574" Oct 03 15:08:27 crc kubenswrapper[4774]: I1003 15:08:27.838181 4774 scope.go:117] "RemoveContainer" containerID="2a4bfc5e85754eead8634748a601025c5b263c7975b481e0ed94d7198740d67e" Oct 03 15:08:27 crc kubenswrapper[4774]: E1003 15:08:27.839134 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2a4bfc5e85754eead8634748a601025c5b263c7975b481e0ed94d7198740d67e\": container with ID starting with 2a4bfc5e85754eead8634748a601025c5b263c7975b481e0ed94d7198740d67e not found: ID does not exist" containerID="2a4bfc5e85754eead8634748a601025c5b263c7975b481e0ed94d7198740d67e" Oct 03 15:08:27 crc kubenswrapper[4774]: I1003 15:08:27.839173 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a4bfc5e85754eead8634748a601025c5b263c7975b481e0ed94d7198740d67e"} err="failed to get container status \"2a4bfc5e85754eead8634748a601025c5b263c7975b481e0ed94d7198740d67e\": rpc error: code = NotFound desc = could not find container \"2a4bfc5e85754eead8634748a601025c5b263c7975b481e0ed94d7198740d67e\": container with ID starting with 2a4bfc5e85754eead8634748a601025c5b263c7975b481e0ed94d7198740d67e not found: ID does not exist" Oct 03 15:08:27 crc kubenswrapper[4774]: I1003 15:08:27.839198 4774 scope.go:117] "RemoveContainer" containerID="fb75149ce2a8ebf389a3121b8b0f1e929ba535770d73341bbb1a5c8419f87ca0" Oct 03 15:08:27 crc kubenswrapper[4774]: E1003 15:08:27.839642 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb75149ce2a8ebf389a3121b8b0f1e929ba535770d73341bbb1a5c8419f87ca0\": container with ID starting with fb75149ce2a8ebf389a3121b8b0f1e929ba535770d73341bbb1a5c8419f87ca0 not found: ID does not exist" containerID="fb75149ce2a8ebf389a3121b8b0f1e929ba535770d73341bbb1a5c8419f87ca0" Oct 03 15:08:27 crc kubenswrapper[4774]: I1003 15:08:27.839690 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb75149ce2a8ebf389a3121b8b0f1e929ba535770d73341bbb1a5c8419f87ca0"} err="failed to get container status \"fb75149ce2a8ebf389a3121b8b0f1e929ba535770d73341bbb1a5c8419f87ca0\": rpc error: code = NotFound desc = could not find container \"fb75149ce2a8ebf389a3121b8b0f1e929ba535770d73341bbb1a5c8419f87ca0\": container with ID 
starting with fb75149ce2a8ebf389a3121b8b0f1e929ba535770d73341bbb1a5c8419f87ca0 not found: ID does not exist" Oct 03 15:08:27 crc kubenswrapper[4774]: I1003 15:08:27.839710 4774 scope.go:117] "RemoveContainer" containerID="9181351fbcbc138b64aac0c8050295991c249cd12ebbdd26ff2c217ba63f1574" Oct 03 15:08:27 crc kubenswrapper[4774]: E1003 15:08:27.840123 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9181351fbcbc138b64aac0c8050295991c249cd12ebbdd26ff2c217ba63f1574\": container with ID starting with 9181351fbcbc138b64aac0c8050295991c249cd12ebbdd26ff2c217ba63f1574 not found: ID does not exist" containerID="9181351fbcbc138b64aac0c8050295991c249cd12ebbdd26ff2c217ba63f1574" Oct 03 15:08:27 crc kubenswrapper[4774]: I1003 15:08:27.840148 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9181351fbcbc138b64aac0c8050295991c249cd12ebbdd26ff2c217ba63f1574"} err="failed to get container status \"9181351fbcbc138b64aac0c8050295991c249cd12ebbdd26ff2c217ba63f1574\": rpc error: code = NotFound desc = could not find container \"9181351fbcbc138b64aac0c8050295991c249cd12ebbdd26ff2c217ba63f1574\": container with ID starting with 9181351fbcbc138b64aac0c8050295991c249cd12ebbdd26ff2c217ba63f1574 not found: ID does not exist" Oct 03 15:08:29 crc kubenswrapper[4774]: I1003 15:08:29.314675 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf" path="/var/lib/kubelet/pods/4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf/volumes" Oct 03 15:08:31 crc kubenswrapper[4774]: I1003 15:08:30.999572 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vwptl"] Oct 03 15:08:31 crc kubenswrapper[4774]: I1003 15:08:31.000253 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vwptl" 
podUID="1b2b2241-4922-45e7-bfce-72e4ce7076c7" containerName="registry-server" containerID="cri-o://396b8ed2dde420d3b11be2691f5adbad1c3fc8126e9e02ef810f6dd1d834f4fd" gracePeriod=2 Oct 03 15:08:31 crc kubenswrapper[4774]: I1003 15:08:31.136714 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-25qgn" Oct 03 15:08:31 crc kubenswrapper[4774]: I1003 15:08:31.137119 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-25qgn" Oct 03 15:08:31 crc kubenswrapper[4774]: I1003 15:08:31.210613 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-25qgn" Oct 03 15:08:31 crc kubenswrapper[4774]: I1003 15:08:31.811130 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-25qgn" Oct 03 15:08:32 crc kubenswrapper[4774]: I1003 15:08:32.796219 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vwptl" Oct 03 15:08:32 crc kubenswrapper[4774]: I1003 15:08:32.802798 4774 generic.go:334] "Generic (PLEG): container finished" podID="1b2b2241-4922-45e7-bfce-72e4ce7076c7" containerID="396b8ed2dde420d3b11be2691f5adbad1c3fc8126e9e02ef810f6dd1d834f4fd" exitCode=0 Oct 03 15:08:32 crc kubenswrapper[4774]: I1003 15:08:32.803895 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vwptl" Oct 03 15:08:32 crc kubenswrapper[4774]: I1003 15:08:32.804104 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwptl" event={"ID":"1b2b2241-4922-45e7-bfce-72e4ce7076c7","Type":"ContainerDied","Data":"396b8ed2dde420d3b11be2691f5adbad1c3fc8126e9e02ef810f6dd1d834f4fd"} Oct 03 15:08:32 crc kubenswrapper[4774]: I1003 15:08:32.804143 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwptl" event={"ID":"1b2b2241-4922-45e7-bfce-72e4ce7076c7","Type":"ContainerDied","Data":"07c52ccacc78ec27b40e81485f322210d06d5b9704d12bb5bd0404e30f2cdc51"} Oct 03 15:08:32 crc kubenswrapper[4774]: I1003 15:08:32.804164 4774 scope.go:117] "RemoveContainer" containerID="396b8ed2dde420d3b11be2691f5adbad1c3fc8126e9e02ef810f6dd1d834f4fd" Oct 03 15:08:32 crc kubenswrapper[4774]: I1003 15:08:32.833846 4774 scope.go:117] "RemoveContainer" containerID="7353df0ee47bcfa7ef79e324d06b4887dc7255fa7587103be2c1f26dc4303ac9" Oct 03 15:08:32 crc kubenswrapper[4774]: I1003 15:08:32.861289 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b2b2241-4922-45e7-bfce-72e4ce7076c7-catalog-content\") pod \"1b2b2241-4922-45e7-bfce-72e4ce7076c7\" (UID: \"1b2b2241-4922-45e7-bfce-72e4ce7076c7\") " Oct 03 15:08:32 crc kubenswrapper[4774]: I1003 15:08:32.861435 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cf2z\" (UniqueName: \"kubernetes.io/projected/1b2b2241-4922-45e7-bfce-72e4ce7076c7-kube-api-access-4cf2z\") pod \"1b2b2241-4922-45e7-bfce-72e4ce7076c7\" (UID: \"1b2b2241-4922-45e7-bfce-72e4ce7076c7\") " Oct 03 15:08:32 crc kubenswrapper[4774]: I1003 15:08:32.861553 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1b2b2241-4922-45e7-bfce-72e4ce7076c7-utilities\") pod \"1b2b2241-4922-45e7-bfce-72e4ce7076c7\" (UID: \"1b2b2241-4922-45e7-bfce-72e4ce7076c7\") " Oct 03 15:08:32 crc kubenswrapper[4774]: I1003 15:08:32.862767 4774 scope.go:117] "RemoveContainer" containerID="03c2f2870ca3db252b3ba9c3153bd35995cb801cbee2f93d053ce2bc5e55cf8f" Oct 03 15:08:32 crc kubenswrapper[4774]: I1003 15:08:32.863055 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b2b2241-4922-45e7-bfce-72e4ce7076c7-utilities" (OuterVolumeSpecName: "utilities") pod "1b2b2241-4922-45e7-bfce-72e4ce7076c7" (UID: "1b2b2241-4922-45e7-bfce-72e4ce7076c7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:08:32 crc kubenswrapper[4774]: I1003 15:08:32.871745 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b2b2241-4922-45e7-bfce-72e4ce7076c7-kube-api-access-4cf2z" (OuterVolumeSpecName: "kube-api-access-4cf2z") pod "1b2b2241-4922-45e7-bfce-72e4ce7076c7" (UID: "1b2b2241-4922-45e7-bfce-72e4ce7076c7"). InnerVolumeSpecName "kube-api-access-4cf2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:08:32 crc kubenswrapper[4774]: I1003 15:08:32.925428 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b2b2241-4922-45e7-bfce-72e4ce7076c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1b2b2241-4922-45e7-bfce-72e4ce7076c7" (UID: "1b2b2241-4922-45e7-bfce-72e4ce7076c7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:08:32 crc kubenswrapper[4774]: I1003 15:08:32.930526 4774 scope.go:117] "RemoveContainer" containerID="396b8ed2dde420d3b11be2691f5adbad1c3fc8126e9e02ef810f6dd1d834f4fd" Oct 03 15:08:32 crc kubenswrapper[4774]: E1003 15:08:32.937017 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"396b8ed2dde420d3b11be2691f5adbad1c3fc8126e9e02ef810f6dd1d834f4fd\": container with ID starting with 396b8ed2dde420d3b11be2691f5adbad1c3fc8126e9e02ef810f6dd1d834f4fd not found: ID does not exist" containerID="396b8ed2dde420d3b11be2691f5adbad1c3fc8126e9e02ef810f6dd1d834f4fd" Oct 03 15:08:32 crc kubenswrapper[4774]: I1003 15:08:32.937084 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"396b8ed2dde420d3b11be2691f5adbad1c3fc8126e9e02ef810f6dd1d834f4fd"} err="failed to get container status \"396b8ed2dde420d3b11be2691f5adbad1c3fc8126e9e02ef810f6dd1d834f4fd\": rpc error: code = NotFound desc = could not find container \"396b8ed2dde420d3b11be2691f5adbad1c3fc8126e9e02ef810f6dd1d834f4fd\": container with ID starting with 396b8ed2dde420d3b11be2691f5adbad1c3fc8126e9e02ef810f6dd1d834f4fd not found: ID does not exist" Oct 03 15:08:32 crc kubenswrapper[4774]: I1003 15:08:32.937116 4774 scope.go:117] "RemoveContainer" containerID="7353df0ee47bcfa7ef79e324d06b4887dc7255fa7587103be2c1f26dc4303ac9" Oct 03 15:08:32 crc kubenswrapper[4774]: E1003 15:08:32.937725 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7353df0ee47bcfa7ef79e324d06b4887dc7255fa7587103be2c1f26dc4303ac9\": container with ID starting with 7353df0ee47bcfa7ef79e324d06b4887dc7255fa7587103be2c1f26dc4303ac9 not found: ID does not exist" containerID="7353df0ee47bcfa7ef79e324d06b4887dc7255fa7587103be2c1f26dc4303ac9" Oct 03 15:08:32 crc kubenswrapper[4774]: I1003 15:08:32.937910 
4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7353df0ee47bcfa7ef79e324d06b4887dc7255fa7587103be2c1f26dc4303ac9"} err="failed to get container status \"7353df0ee47bcfa7ef79e324d06b4887dc7255fa7587103be2c1f26dc4303ac9\": rpc error: code = NotFound desc = could not find container \"7353df0ee47bcfa7ef79e324d06b4887dc7255fa7587103be2c1f26dc4303ac9\": container with ID starting with 7353df0ee47bcfa7ef79e324d06b4887dc7255fa7587103be2c1f26dc4303ac9 not found: ID does not exist" Oct 03 15:08:32 crc kubenswrapper[4774]: I1003 15:08:32.938063 4774 scope.go:117] "RemoveContainer" containerID="03c2f2870ca3db252b3ba9c3153bd35995cb801cbee2f93d053ce2bc5e55cf8f" Oct 03 15:08:32 crc kubenswrapper[4774]: E1003 15:08:32.938688 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03c2f2870ca3db252b3ba9c3153bd35995cb801cbee2f93d053ce2bc5e55cf8f\": container with ID starting with 03c2f2870ca3db252b3ba9c3153bd35995cb801cbee2f93d053ce2bc5e55cf8f not found: ID does not exist" containerID="03c2f2870ca3db252b3ba9c3153bd35995cb801cbee2f93d053ce2bc5e55cf8f" Oct 03 15:08:32 crc kubenswrapper[4774]: I1003 15:08:32.938724 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03c2f2870ca3db252b3ba9c3153bd35995cb801cbee2f93d053ce2bc5e55cf8f"} err="failed to get container status \"03c2f2870ca3db252b3ba9c3153bd35995cb801cbee2f93d053ce2bc5e55cf8f\": rpc error: code = NotFound desc = could not find container \"03c2f2870ca3db252b3ba9c3153bd35995cb801cbee2f93d053ce2bc5e55cf8f\": container with ID starting with 03c2f2870ca3db252b3ba9c3153bd35995cb801cbee2f93d053ce2bc5e55cf8f not found: ID does not exist" Oct 03 15:08:32 crc kubenswrapper[4774]: I1003 15:08:32.963271 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1b2b2241-4922-45e7-bfce-72e4ce7076c7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 15:08:32 crc kubenswrapper[4774]: I1003 15:08:32.963298 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cf2z\" (UniqueName: \"kubernetes.io/projected/1b2b2241-4922-45e7-bfce-72e4ce7076c7-kube-api-access-4cf2z\") on node \"crc\" DevicePath \"\"" Oct 03 15:08:32 crc kubenswrapper[4774]: I1003 15:08:32.963307 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b2b2241-4922-45e7-bfce-72e4ce7076c7-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 15:08:33 crc kubenswrapper[4774]: I1003 15:08:33.143682 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vwptl"] Oct 03 15:08:33 crc kubenswrapper[4774]: I1003 15:08:33.153766 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vwptl"] Oct 03 15:08:33 crc kubenswrapper[4774]: I1003 15:08:33.312083 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b2b2241-4922-45e7-bfce-72e4ce7076c7" path="/var/lib/kubelet/pods/1b2b2241-4922-45e7-bfce-72e4ce7076c7/volumes" Oct 03 15:08:33 crc kubenswrapper[4774]: I1003 15:08:33.595521 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-25qgn"] Oct 03 15:08:33 crc kubenswrapper[4774]: I1003 15:08:33.813127 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-25qgn" podUID="2b9923a4-3d25-4762-9f54-c1afebdfb78a" containerName="registry-server" containerID="cri-o://1db96eb01e1b2e10d6635d1c82b36f4f233b848183929369eb389478eca1247a" gracePeriod=2 Oct 03 15:08:34 crc kubenswrapper[4774]: I1003 15:08:34.748721 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-25qgn" Oct 03 15:08:34 crc kubenswrapper[4774]: I1003 15:08:34.801328 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b9923a4-3d25-4762-9f54-c1afebdfb78a-utilities\") pod \"2b9923a4-3d25-4762-9f54-c1afebdfb78a\" (UID: \"2b9923a4-3d25-4762-9f54-c1afebdfb78a\") " Oct 03 15:08:34 crc kubenswrapper[4774]: I1003 15:08:34.801423 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46dhw\" (UniqueName: \"kubernetes.io/projected/2b9923a4-3d25-4762-9f54-c1afebdfb78a-kube-api-access-46dhw\") pod \"2b9923a4-3d25-4762-9f54-c1afebdfb78a\" (UID: \"2b9923a4-3d25-4762-9f54-c1afebdfb78a\") " Oct 03 15:08:34 crc kubenswrapper[4774]: I1003 15:08:34.801671 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b9923a4-3d25-4762-9f54-c1afebdfb78a-catalog-content\") pod \"2b9923a4-3d25-4762-9f54-c1afebdfb78a\" (UID: \"2b9923a4-3d25-4762-9f54-c1afebdfb78a\") " Oct 03 15:08:34 crc kubenswrapper[4774]: I1003 15:08:34.802796 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b9923a4-3d25-4762-9f54-c1afebdfb78a-utilities" (OuterVolumeSpecName: "utilities") pod "2b9923a4-3d25-4762-9f54-c1afebdfb78a" (UID: "2b9923a4-3d25-4762-9f54-c1afebdfb78a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:08:34 crc kubenswrapper[4774]: I1003 15:08:34.810825 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b9923a4-3d25-4762-9f54-c1afebdfb78a-kube-api-access-46dhw" (OuterVolumeSpecName: "kube-api-access-46dhw") pod "2b9923a4-3d25-4762-9f54-c1afebdfb78a" (UID: "2b9923a4-3d25-4762-9f54-c1afebdfb78a"). InnerVolumeSpecName "kube-api-access-46dhw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:08:34 crc kubenswrapper[4774]: I1003 15:08:34.822580 4774 generic.go:334] "Generic (PLEG): container finished" podID="2b9923a4-3d25-4762-9f54-c1afebdfb78a" containerID="1db96eb01e1b2e10d6635d1c82b36f4f233b848183929369eb389478eca1247a" exitCode=0 Oct 03 15:08:34 crc kubenswrapper[4774]: I1003 15:08:34.822622 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25qgn" event={"ID":"2b9923a4-3d25-4762-9f54-c1afebdfb78a","Type":"ContainerDied","Data":"1db96eb01e1b2e10d6635d1c82b36f4f233b848183929369eb389478eca1247a"} Oct 03 15:08:34 crc kubenswrapper[4774]: I1003 15:08:34.822639 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-25qgn" Oct 03 15:08:34 crc kubenswrapper[4774]: I1003 15:08:34.822660 4774 scope.go:117] "RemoveContainer" containerID="1db96eb01e1b2e10d6635d1c82b36f4f233b848183929369eb389478eca1247a" Oct 03 15:08:34 crc kubenswrapper[4774]: I1003 15:08:34.822648 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25qgn" event={"ID":"2b9923a4-3d25-4762-9f54-c1afebdfb78a","Type":"ContainerDied","Data":"8a35976a2a4dd5af215c8e2b0df6fac9613928aa10aff183db7bd17bcc1f417d"} Oct 03 15:08:34 crc kubenswrapper[4774]: I1003 15:08:34.864243 4774 scope.go:117] "RemoveContainer" containerID="f7a76c3634497f1d0db7e1f9dfd4a8de16d354c96c294465a77f39f21f7ef9cc" Oct 03 15:08:34 crc kubenswrapper[4774]: I1003 15:08:34.870088 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b9923a4-3d25-4762-9f54-c1afebdfb78a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b9923a4-3d25-4762-9f54-c1afebdfb78a" (UID: "2b9923a4-3d25-4762-9f54-c1afebdfb78a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:08:34 crc kubenswrapper[4774]: I1003 15:08:34.885230 4774 scope.go:117] "RemoveContainer" containerID="ec2e4d9875b165a4cae43bcb1c003483b105e68e6acaa998958a847b08209564" Oct 03 15:08:34 crc kubenswrapper[4774]: I1003 15:08:34.903922 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b9923a4-3d25-4762-9f54-c1afebdfb78a-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 15:08:34 crc kubenswrapper[4774]: I1003 15:08:34.903962 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46dhw\" (UniqueName: \"kubernetes.io/projected/2b9923a4-3d25-4762-9f54-c1afebdfb78a-kube-api-access-46dhw\") on node \"crc\" DevicePath \"\"" Oct 03 15:08:34 crc kubenswrapper[4774]: I1003 15:08:34.904015 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b9923a4-3d25-4762-9f54-c1afebdfb78a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 15:08:34 crc kubenswrapper[4774]: I1003 15:08:34.929627 4774 scope.go:117] "RemoveContainer" containerID="1db96eb01e1b2e10d6635d1c82b36f4f233b848183929369eb389478eca1247a" Oct 03 15:08:34 crc kubenswrapper[4774]: E1003 15:08:34.930087 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1db96eb01e1b2e10d6635d1c82b36f4f233b848183929369eb389478eca1247a\": container with ID starting with 1db96eb01e1b2e10d6635d1c82b36f4f233b848183929369eb389478eca1247a not found: ID does not exist" containerID="1db96eb01e1b2e10d6635d1c82b36f4f233b848183929369eb389478eca1247a" Oct 03 15:08:34 crc kubenswrapper[4774]: I1003 15:08:34.930147 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1db96eb01e1b2e10d6635d1c82b36f4f233b848183929369eb389478eca1247a"} err="failed to get container status 
\"1db96eb01e1b2e10d6635d1c82b36f4f233b848183929369eb389478eca1247a\": rpc error: code = NotFound desc = could not find container \"1db96eb01e1b2e10d6635d1c82b36f4f233b848183929369eb389478eca1247a\": container with ID starting with 1db96eb01e1b2e10d6635d1c82b36f4f233b848183929369eb389478eca1247a not found: ID does not exist" Oct 03 15:08:34 crc kubenswrapper[4774]: I1003 15:08:34.930191 4774 scope.go:117] "RemoveContainer" containerID="f7a76c3634497f1d0db7e1f9dfd4a8de16d354c96c294465a77f39f21f7ef9cc" Oct 03 15:08:34 crc kubenswrapper[4774]: E1003 15:08:34.930603 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7a76c3634497f1d0db7e1f9dfd4a8de16d354c96c294465a77f39f21f7ef9cc\": container with ID starting with f7a76c3634497f1d0db7e1f9dfd4a8de16d354c96c294465a77f39f21f7ef9cc not found: ID does not exist" containerID="f7a76c3634497f1d0db7e1f9dfd4a8de16d354c96c294465a77f39f21f7ef9cc" Oct 03 15:08:34 crc kubenswrapper[4774]: I1003 15:08:34.930648 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7a76c3634497f1d0db7e1f9dfd4a8de16d354c96c294465a77f39f21f7ef9cc"} err="failed to get container status \"f7a76c3634497f1d0db7e1f9dfd4a8de16d354c96c294465a77f39f21f7ef9cc\": rpc error: code = NotFound desc = could not find container \"f7a76c3634497f1d0db7e1f9dfd4a8de16d354c96c294465a77f39f21f7ef9cc\": container with ID starting with f7a76c3634497f1d0db7e1f9dfd4a8de16d354c96c294465a77f39f21f7ef9cc not found: ID does not exist" Oct 03 15:08:34 crc kubenswrapper[4774]: I1003 15:08:34.930676 4774 scope.go:117] "RemoveContainer" containerID="ec2e4d9875b165a4cae43bcb1c003483b105e68e6acaa998958a847b08209564" Oct 03 15:08:34 crc kubenswrapper[4774]: E1003 15:08:34.930946 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ec2e4d9875b165a4cae43bcb1c003483b105e68e6acaa998958a847b08209564\": container with ID starting with ec2e4d9875b165a4cae43bcb1c003483b105e68e6acaa998958a847b08209564 not found: ID does not exist" containerID="ec2e4d9875b165a4cae43bcb1c003483b105e68e6acaa998958a847b08209564" Oct 03 15:08:34 crc kubenswrapper[4774]: I1003 15:08:34.930985 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec2e4d9875b165a4cae43bcb1c003483b105e68e6acaa998958a847b08209564"} err="failed to get container status \"ec2e4d9875b165a4cae43bcb1c003483b105e68e6acaa998958a847b08209564\": rpc error: code = NotFound desc = could not find container \"ec2e4d9875b165a4cae43bcb1c003483b105e68e6acaa998958a847b08209564\": container with ID starting with ec2e4d9875b165a4cae43bcb1c003483b105e68e6acaa998958a847b08209564 not found: ID does not exist" Oct 03 15:08:35 crc kubenswrapper[4774]: I1003 15:08:35.178954 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-25qgn"] Oct 03 15:08:35 crc kubenswrapper[4774]: I1003 15:08:35.193412 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-25qgn"] Oct 03 15:08:35 crc kubenswrapper[4774]: I1003 15:08:35.318584 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b9923a4-3d25-4762-9f54-c1afebdfb78a" path="/var/lib/kubelet/pods/2b9923a4-3d25-4762-9f54-c1afebdfb78a/volumes" Oct 03 15:08:50 crc kubenswrapper[4774]: I1003 15:08:50.653491 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:08:50 crc kubenswrapper[4774]: I1003 15:08:50.654126 4774 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:09:14 crc kubenswrapper[4774]: I1003 15:09:14.293113 4774 generic.go:334] "Generic (PLEG): container finished" podID="d70dce54-aa22-4af1-a341-4ff90ba78722" containerID="16b3dd861d084f153ba8b566db0be66bcc2530c636d4921cc8d373dcd2babe92" exitCode=0 Oct 03 15:09:14 crc kubenswrapper[4774]: I1003 15:09:14.293244 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb" event={"ID":"d70dce54-aa22-4af1-a341-4ff90ba78722","Type":"ContainerDied","Data":"16b3dd861d084f153ba8b566db0be66bcc2530c636d4921cc8d373dcd2babe92"} Oct 03 15:09:15 crc kubenswrapper[4774]: I1003 15:09:15.725621 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb" Oct 03 15:09:15 crc kubenswrapper[4774]: I1003 15:09:15.828364 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5ftm\" (UniqueName: \"kubernetes.io/projected/d70dce54-aa22-4af1-a341-4ff90ba78722-kube-api-access-g5ftm\") pod \"d70dce54-aa22-4af1-a341-4ff90ba78722\" (UID: \"d70dce54-aa22-4af1-a341-4ff90ba78722\") " Oct 03 15:09:15 crc kubenswrapper[4774]: I1003 15:09:15.828693 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d70dce54-aa22-4af1-a341-4ff90ba78722-inventory\") pod \"d70dce54-aa22-4af1-a341-4ff90ba78722\" (UID: \"d70dce54-aa22-4af1-a341-4ff90ba78722\") " Oct 03 15:09:15 crc kubenswrapper[4774]: I1003 15:09:15.828864 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/d70dce54-aa22-4af1-a341-4ff90ba78722-ssh-key\") pod \"d70dce54-aa22-4af1-a341-4ff90ba78722\" (UID: \"d70dce54-aa22-4af1-a341-4ff90ba78722\") " Oct 03 15:09:15 crc kubenswrapper[4774]: I1003 15:09:15.829355 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d70dce54-aa22-4af1-a341-4ff90ba78722-bootstrap-combined-ca-bundle\") pod \"d70dce54-aa22-4af1-a341-4ff90ba78722\" (UID: \"d70dce54-aa22-4af1-a341-4ff90ba78722\") " Oct 03 15:09:15 crc kubenswrapper[4774]: I1003 15:09:15.834345 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d70dce54-aa22-4af1-a341-4ff90ba78722-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "d70dce54-aa22-4af1-a341-4ff90ba78722" (UID: "d70dce54-aa22-4af1-a341-4ff90ba78722"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:09:15 crc kubenswrapper[4774]: I1003 15:09:15.834403 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d70dce54-aa22-4af1-a341-4ff90ba78722-kube-api-access-g5ftm" (OuterVolumeSpecName: "kube-api-access-g5ftm") pod "d70dce54-aa22-4af1-a341-4ff90ba78722" (UID: "d70dce54-aa22-4af1-a341-4ff90ba78722"). InnerVolumeSpecName "kube-api-access-g5ftm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:09:15 crc kubenswrapper[4774]: I1003 15:09:15.876738 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d70dce54-aa22-4af1-a341-4ff90ba78722-inventory" (OuterVolumeSpecName: "inventory") pod "d70dce54-aa22-4af1-a341-4ff90ba78722" (UID: "d70dce54-aa22-4af1-a341-4ff90ba78722"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:09:15 crc kubenswrapper[4774]: I1003 15:09:15.886846 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d70dce54-aa22-4af1-a341-4ff90ba78722-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d70dce54-aa22-4af1-a341-4ff90ba78722" (UID: "d70dce54-aa22-4af1-a341-4ff90ba78722"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:09:15 crc kubenswrapper[4774]: I1003 15:09:15.931036 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5ftm\" (UniqueName: \"kubernetes.io/projected/d70dce54-aa22-4af1-a341-4ff90ba78722-kube-api-access-g5ftm\") on node \"crc\" DevicePath \"\"" Oct 03 15:09:15 crc kubenswrapper[4774]: I1003 15:09:15.931070 4774 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d70dce54-aa22-4af1-a341-4ff90ba78722-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 15:09:15 crc kubenswrapper[4774]: I1003 15:09:15.931083 4774 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d70dce54-aa22-4af1-a341-4ff90ba78722-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 15:09:15 crc kubenswrapper[4774]: I1003 15:09:15.931092 4774 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d70dce54-aa22-4af1-a341-4ff90ba78722-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:09:16 crc kubenswrapper[4774]: I1003 15:09:16.312575 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb" event={"ID":"d70dce54-aa22-4af1-a341-4ff90ba78722","Type":"ContainerDied","Data":"c7eae101dbe0a6da1835da0ce05097ab0a807df3cb4b133d297cad9690661119"} Oct 03 15:09:16 crc kubenswrapper[4774]: I1003 15:09:16.312619 4774 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb" Oct 03 15:09:16 crc kubenswrapper[4774]: I1003 15:09:16.312631 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7eae101dbe0a6da1835da0ce05097ab0a807df3cb4b133d297cad9690661119" Oct 03 15:09:16 crc kubenswrapper[4774]: I1003 15:09:16.410913 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bh568"] Oct 03 15:09:16 crc kubenswrapper[4774]: E1003 15:09:16.411291 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d70dce54-aa22-4af1-a341-4ff90ba78722" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 03 15:09:16 crc kubenswrapper[4774]: I1003 15:09:16.411306 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="d70dce54-aa22-4af1-a341-4ff90ba78722" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 03 15:09:16 crc kubenswrapper[4774]: E1003 15:09:16.411321 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf" containerName="extract-content" Oct 03 15:09:16 crc kubenswrapper[4774]: I1003 15:09:16.411328 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf" containerName="extract-content" Oct 03 15:09:16 crc kubenswrapper[4774]: E1003 15:09:16.411337 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b9923a4-3d25-4762-9f54-c1afebdfb78a" containerName="extract-utilities" Oct 03 15:09:16 crc kubenswrapper[4774]: I1003 15:09:16.411343 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b9923a4-3d25-4762-9f54-c1afebdfb78a" containerName="extract-utilities" Oct 03 15:09:16 crc kubenswrapper[4774]: E1003 15:09:16.411357 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b2b2241-4922-45e7-bfce-72e4ce7076c7" containerName="extract-content" Oct 03 15:09:16 crc 
kubenswrapper[4774]: I1003 15:09:16.411364 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b2b2241-4922-45e7-bfce-72e4ce7076c7" containerName="extract-content" Oct 03 15:09:16 crc kubenswrapper[4774]: E1003 15:09:16.411388 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf" containerName="extract-utilities" Oct 03 15:09:16 crc kubenswrapper[4774]: I1003 15:09:16.411396 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf" containerName="extract-utilities" Oct 03 15:09:16 crc kubenswrapper[4774]: E1003 15:09:16.411410 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b9923a4-3d25-4762-9f54-c1afebdfb78a" containerName="extract-content" Oct 03 15:09:16 crc kubenswrapper[4774]: I1003 15:09:16.411416 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b9923a4-3d25-4762-9f54-c1afebdfb78a" containerName="extract-content" Oct 03 15:09:16 crc kubenswrapper[4774]: E1003 15:09:16.411424 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf" containerName="registry-server" Oct 03 15:09:16 crc kubenswrapper[4774]: I1003 15:09:16.411430 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf" containerName="registry-server" Oct 03 15:09:16 crc kubenswrapper[4774]: E1003 15:09:16.411436 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b2b2241-4922-45e7-bfce-72e4ce7076c7" containerName="extract-utilities" Oct 03 15:09:16 crc kubenswrapper[4774]: I1003 15:09:16.411442 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b2b2241-4922-45e7-bfce-72e4ce7076c7" containerName="extract-utilities" Oct 03 15:09:16 crc kubenswrapper[4774]: E1003 15:09:16.411452 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b2b2241-4922-45e7-bfce-72e4ce7076c7" containerName="registry-server" Oct 03 15:09:16 crc 
kubenswrapper[4774]: I1003 15:09:16.411457 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b2b2241-4922-45e7-bfce-72e4ce7076c7" containerName="registry-server" Oct 03 15:09:16 crc kubenswrapper[4774]: E1003 15:09:16.411473 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b9923a4-3d25-4762-9f54-c1afebdfb78a" containerName="registry-server" Oct 03 15:09:16 crc kubenswrapper[4774]: I1003 15:09:16.411478 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b9923a4-3d25-4762-9f54-c1afebdfb78a" containerName="registry-server" Oct 03 15:09:16 crc kubenswrapper[4774]: I1003 15:09:16.411654 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b2b2241-4922-45e7-bfce-72e4ce7076c7" containerName="registry-server" Oct 03 15:09:16 crc kubenswrapper[4774]: I1003 15:09:16.411665 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="d70dce54-aa22-4af1-a341-4ff90ba78722" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 03 15:09:16 crc kubenswrapper[4774]: I1003 15:09:16.411678 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bd8ef87-b6a6-4cf9-9c92-d60da2bea3bf" containerName="registry-server" Oct 03 15:09:16 crc kubenswrapper[4774]: I1003 15:09:16.411690 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b9923a4-3d25-4762-9f54-c1afebdfb78a" containerName="registry-server" Oct 03 15:09:16 crc kubenswrapper[4774]: I1003 15:09:16.412269 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bh568" Oct 03 15:09:16 crc kubenswrapper[4774]: I1003 15:09:16.415207 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 15:09:16 crc kubenswrapper[4774]: I1003 15:09:16.415239 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 15:09:16 crc kubenswrapper[4774]: I1003 15:09:16.415651 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 15:09:16 crc kubenswrapper[4774]: I1003 15:09:16.416038 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7bdzq" Oct 03 15:09:16 crc kubenswrapper[4774]: I1003 15:09:16.432328 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bh568"] Oct 03 15:09:16 crc kubenswrapper[4774]: I1003 15:09:16.552790 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/58fbe1ae-46c5-4bb6-99ec-61ca05d737b1-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bh568\" (UID: \"58fbe1ae-46c5-4bb6-99ec-61ca05d737b1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bh568" Oct 03 15:09:16 crc kubenswrapper[4774]: I1003 15:09:16.553090 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58fbe1ae-46c5-4bb6-99ec-61ca05d737b1-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bh568\" (UID: \"58fbe1ae-46c5-4bb6-99ec-61ca05d737b1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bh568" Oct 03 15:09:16 crc kubenswrapper[4774]: I1003 15:09:16.553302 4774 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fkd7\" (UniqueName: \"kubernetes.io/projected/58fbe1ae-46c5-4bb6-99ec-61ca05d737b1-kube-api-access-2fkd7\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bh568\" (UID: \"58fbe1ae-46c5-4bb6-99ec-61ca05d737b1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bh568" Oct 03 15:09:16 crc kubenswrapper[4774]: I1003 15:09:16.655465 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fkd7\" (UniqueName: \"kubernetes.io/projected/58fbe1ae-46c5-4bb6-99ec-61ca05d737b1-kube-api-access-2fkd7\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bh568\" (UID: \"58fbe1ae-46c5-4bb6-99ec-61ca05d737b1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bh568" Oct 03 15:09:16 crc kubenswrapper[4774]: I1003 15:09:16.655678 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/58fbe1ae-46c5-4bb6-99ec-61ca05d737b1-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bh568\" (UID: \"58fbe1ae-46c5-4bb6-99ec-61ca05d737b1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bh568" Oct 03 15:09:16 crc kubenswrapper[4774]: I1003 15:09:16.655884 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58fbe1ae-46c5-4bb6-99ec-61ca05d737b1-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bh568\" (UID: \"58fbe1ae-46c5-4bb6-99ec-61ca05d737b1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bh568" Oct 03 15:09:16 crc kubenswrapper[4774]: I1003 15:09:16.661004 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58fbe1ae-46c5-4bb6-99ec-61ca05d737b1-inventory\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-bh568\" (UID: \"58fbe1ae-46c5-4bb6-99ec-61ca05d737b1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bh568" Oct 03 15:09:16 crc kubenswrapper[4774]: I1003 15:09:16.661901 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/58fbe1ae-46c5-4bb6-99ec-61ca05d737b1-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bh568\" (UID: \"58fbe1ae-46c5-4bb6-99ec-61ca05d737b1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bh568" Oct 03 15:09:16 crc kubenswrapper[4774]: I1003 15:09:16.682324 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fkd7\" (UniqueName: \"kubernetes.io/projected/58fbe1ae-46c5-4bb6-99ec-61ca05d737b1-kube-api-access-2fkd7\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bh568\" (UID: \"58fbe1ae-46c5-4bb6-99ec-61ca05d737b1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bh568" Oct 03 15:09:16 crc kubenswrapper[4774]: I1003 15:09:16.729985 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bh568" Oct 03 15:09:17 crc kubenswrapper[4774]: I1003 15:09:17.276711 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bh568"] Oct 03 15:09:17 crc kubenswrapper[4774]: I1003 15:09:17.321086 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bh568" event={"ID":"58fbe1ae-46c5-4bb6-99ec-61ca05d737b1","Type":"ContainerStarted","Data":"b05cd0c297b2830fa567bab552cb5947785a169fcbe4b6473b210c9a43b0cfc3"} Oct 03 15:09:18 crc kubenswrapper[4774]: I1003 15:09:18.330569 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bh568" event={"ID":"58fbe1ae-46c5-4bb6-99ec-61ca05d737b1","Type":"ContainerStarted","Data":"14082886358b583f9494294fe1af1aa706875b1e8c2338338229a7c8e6ab618c"} Oct 03 15:09:18 crc kubenswrapper[4774]: I1003 15:09:18.355030 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bh568" podStartSLOduration=1.7777296310000001 podStartE2EDuration="2.355007509s" podCreationTimestamp="2025-10-03 15:09:16 +0000 UTC" firstStartedPulling="2025-10-03 15:09:17.281301992 +0000 UTC m=+1579.870505444" lastFinishedPulling="2025-10-03 15:09:17.85857987 +0000 UTC m=+1580.447783322" observedRunningTime="2025-10-03 15:09:18.34778929 +0000 UTC m=+1580.936992752" watchObservedRunningTime="2025-10-03 15:09:18.355007509 +0000 UTC m=+1580.944210971" Oct 03 15:09:20 crc kubenswrapper[4774]: I1003 15:09:20.653766 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 
15:09:20 crc kubenswrapper[4774]: I1003 15:09:20.654021 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:09:20 crc kubenswrapper[4774]: I1003 15:09:20.654061 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" Oct 03 15:09:20 crc kubenswrapper[4774]: I1003 15:09:20.654892 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"74d4d4fd29e66b3f46bed47cb8482d6e253bf183444fa990f8669734114e0846"} pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 15:09:20 crc kubenswrapper[4774]: I1003 15:09:20.654940 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" containerID="cri-o://74d4d4fd29e66b3f46bed47cb8482d6e253bf183444fa990f8669734114e0846" gracePeriod=600 Oct 03 15:09:20 crc kubenswrapper[4774]: E1003 15:09:20.776317 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:09:21 crc kubenswrapper[4774]: I1003 15:09:21.370806 4774 generic.go:334] 
"Generic (PLEG): container finished" podID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerID="74d4d4fd29e66b3f46bed47cb8482d6e253bf183444fa990f8669734114e0846" exitCode=0 Oct 03 15:09:21 crc kubenswrapper[4774]: I1003 15:09:21.370851 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" event={"ID":"ca37ac4b-f421-4198-a179-12901d36f0f5","Type":"ContainerDied","Data":"74d4d4fd29e66b3f46bed47cb8482d6e253bf183444fa990f8669734114e0846"} Oct 03 15:09:21 crc kubenswrapper[4774]: I1003 15:09:21.370914 4774 scope.go:117] "RemoveContainer" containerID="44818931129eda9720850f3c5b49565e0ee25d9e624e68b358f2ade90a0039e5" Oct 03 15:09:21 crc kubenswrapper[4774]: I1003 15:09:21.371763 4774 scope.go:117] "RemoveContainer" containerID="74d4d4fd29e66b3f46bed47cb8482d6e253bf183444fa990f8669734114e0846" Oct 03 15:09:21 crc kubenswrapper[4774]: E1003 15:09:21.372102 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:09:34 crc kubenswrapper[4774]: I1003 15:09:34.301906 4774 scope.go:117] "RemoveContainer" containerID="74d4d4fd29e66b3f46bed47cb8482d6e253bf183444fa990f8669734114e0846" Oct 03 15:09:34 crc kubenswrapper[4774]: E1003 15:09:34.303674 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" 
podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:09:49 crc kubenswrapper[4774]: I1003 15:09:49.307619 4774 scope.go:117] "RemoveContainer" containerID="74d4d4fd29e66b3f46bed47cb8482d6e253bf183444fa990f8669734114e0846" Oct 03 15:09:49 crc kubenswrapper[4774]: E1003 15:09:49.308366 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:09:50 crc kubenswrapper[4774]: I1003 15:09:50.055953 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-jpslj"] Oct 03 15:09:50 crc kubenswrapper[4774]: I1003 15:09:50.070612 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-jpslj"] Oct 03 15:09:51 crc kubenswrapper[4774]: I1003 15:09:51.318249 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="109aa946-04bb-44f0-be87-89af984dd880" path="/var/lib/kubelet/pods/109aa946-04bb-44f0-be87-89af984dd880/volumes" Oct 03 15:09:54 crc kubenswrapper[4774]: I1003 15:09:54.045721 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-7sz4d"] Oct 03 15:09:54 crc kubenswrapper[4774]: I1003 15:09:54.062800 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-m8z68"] Oct 03 15:09:54 crc kubenswrapper[4774]: I1003 15:09:54.073209 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-7sz4d"] Oct 03 15:09:54 crc kubenswrapper[4774]: I1003 15:09:54.082574 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-m8z68"] Oct 03 15:09:55 crc kubenswrapper[4774]: I1003 15:09:55.310513 4774 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dd8b18c-1808-48ac-8375-7cc2aff05b12" path="/var/lib/kubelet/pods/0dd8b18c-1808-48ac-8375-7cc2aff05b12/volumes" Oct 03 15:09:55 crc kubenswrapper[4774]: I1003 15:09:55.311034 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb71d2a9-eead-444a-ab69-cd9315317392" path="/var/lib/kubelet/pods/bb71d2a9-eead-444a-ab69-cd9315317392/volumes" Oct 03 15:10:01 crc kubenswrapper[4774]: I1003 15:10:01.070232 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-c7c3-account-create-97bj7"] Oct 03 15:10:01 crc kubenswrapper[4774]: I1003 15:10:01.082278 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-eb1a-account-create-6smf5"] Oct 03 15:10:01 crc kubenswrapper[4774]: I1003 15:10:01.092799 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-c7c3-account-create-97bj7"] Oct 03 15:10:01 crc kubenswrapper[4774]: I1003 15:10:01.101186 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-eb1a-account-create-6smf5"] Oct 03 15:10:01 crc kubenswrapper[4774]: I1003 15:10:01.336505 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fc94371-a912-4891-a328-1bc2b840fa61" path="/var/lib/kubelet/pods/1fc94371-a912-4891-a328-1bc2b840fa61/volumes" Oct 03 15:10:01 crc kubenswrapper[4774]: I1003 15:10:01.337835 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40ae99e2-2da0-4b3c-b1e4-92101379dcbf" path="/var/lib/kubelet/pods/40ae99e2-2da0-4b3c-b1e4-92101379dcbf/volumes" Oct 03 15:10:03 crc kubenswrapper[4774]: I1003 15:10:03.306276 4774 scope.go:117] "RemoveContainer" containerID="74d4d4fd29e66b3f46bed47cb8482d6e253bf183444fa990f8669734114e0846" Oct 03 15:10:03 crc kubenswrapper[4774]: E1003 15:10:03.306726 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:10:16 crc kubenswrapper[4774]: I1003 15:10:16.299562 4774 scope.go:117] "RemoveContainer" containerID="74d4d4fd29e66b3f46bed47cb8482d6e253bf183444fa990f8669734114e0846" Oct 03 15:10:16 crc kubenswrapper[4774]: E1003 15:10:16.300802 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:10:19 crc kubenswrapper[4774]: I1003 15:10:19.030451 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-418c-account-create-6xbkh"] Oct 03 15:10:19 crc kubenswrapper[4774]: I1003 15:10:19.047067 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-418c-account-create-6xbkh"] Oct 03 15:10:19 crc kubenswrapper[4774]: I1003 15:10:19.311059 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eff4b471-cbe1-458d-b00a-9ab69a909afc" path="/var/lib/kubelet/pods/eff4b471-cbe1-458d-b00a-9ab69a909afc/volumes" Oct 03 15:10:21 crc kubenswrapper[4774]: I1003 15:10:21.033532 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-zhgsz"] Oct 03 15:10:21 crc kubenswrapper[4774]: I1003 15:10:21.041622 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-zhgsz"] Oct 03 15:10:21 crc kubenswrapper[4774]: I1003 15:10:21.310585 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bb7d18d2-5f77-4f17-8fce-7ea5b663a23c" path="/var/lib/kubelet/pods/bb7d18d2-5f77-4f17-8fce-7ea5b663a23c/volumes" Oct 03 15:10:24 crc kubenswrapper[4774]: I1003 15:10:24.031972 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-7qrp9"] Oct 03 15:10:24 crc kubenswrapper[4774]: I1003 15:10:24.040629 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-4t58x"] Oct 03 15:10:24 crc kubenswrapper[4774]: I1003 15:10:24.047728 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-4t58x"] Oct 03 15:10:24 crc kubenswrapper[4774]: I1003 15:10:24.056816 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-7qrp9"] Oct 03 15:10:24 crc kubenswrapper[4774]: I1003 15:10:24.446430 4774 scope.go:117] "RemoveContainer" containerID="52728a933e58597fffa7010cd561662a26848eab1cc304ee752cd60cc1acdf00" Oct 03 15:10:24 crc kubenswrapper[4774]: I1003 15:10:24.481626 4774 scope.go:117] "RemoveContainer" containerID="c0f76ae4203ec72e832ffd26f69a5a6ae33ff6fd5a597716122dfcae726f839d" Oct 03 15:10:24 crc kubenswrapper[4774]: I1003 15:10:24.533046 4774 scope.go:117] "RemoveContainer" containerID="6bed31e5afaa827109722d902bec7fe86975cf90c18bbb62ab8c3a3f56f37b11" Oct 03 15:10:24 crc kubenswrapper[4774]: I1003 15:10:24.591761 4774 scope.go:117] "RemoveContainer" containerID="701fe0ed81ae955a0e61706167acc81a8fac0753da1cee9ba2afb8b71a01d0cd" Oct 03 15:10:24 crc kubenswrapper[4774]: I1003 15:10:24.645217 4774 scope.go:117] "RemoveContainer" containerID="e5a9291ddc286e46dade07b7c1330d3089b9bfff3d2dffbf26f97751487d00a0" Oct 03 15:10:24 crc kubenswrapper[4774]: I1003 15:10:24.692961 4774 scope.go:117] "RemoveContainer" containerID="291639694edbf27b3b0404aa7375f128dd2eedc659d823a68f0102c9b365ea54" Oct 03 15:10:24 crc kubenswrapper[4774]: I1003 15:10:24.734295 4774 scope.go:117] "RemoveContainer" 
containerID="03fffac536e16617026539d41dbde80ddf4cb3b5b0cce719929c5aecfc217adf" Oct 03 15:10:25 crc kubenswrapper[4774]: I1003 15:10:25.311716 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13144fd0-9c06-480f-8809-1021c8f2ccd3" path="/var/lib/kubelet/pods/13144fd0-9c06-480f-8809-1021c8f2ccd3/volumes" Oct 03 15:10:25 crc kubenswrapper[4774]: I1003 15:10:25.312239 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5dad4cd-6b95-4b92-804a-c94135bff5ae" path="/var/lib/kubelet/pods/b5dad4cd-6b95-4b92-804a-c94135bff5ae/volumes" Oct 03 15:10:28 crc kubenswrapper[4774]: I1003 15:10:28.049556 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-k72d4"] Oct 03 15:10:28 crc kubenswrapper[4774]: I1003 15:10:28.060471 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-k72d4"] Oct 03 15:10:29 crc kubenswrapper[4774]: I1003 15:10:29.045052 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-x6k8t"] Oct 03 15:10:29 crc kubenswrapper[4774]: I1003 15:10:29.055956 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-a1da-account-create-z4r4k"] Oct 03 15:10:29 crc kubenswrapper[4774]: I1003 15:10:29.066523 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-a1da-account-create-z4r4k"] Oct 03 15:10:29 crc kubenswrapper[4774]: I1003 15:10:29.076772 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-x6k8t"] Oct 03 15:10:29 crc kubenswrapper[4774]: I1003 15:10:29.307842 4774 scope.go:117] "RemoveContainer" containerID="74d4d4fd29e66b3f46bed47cb8482d6e253bf183444fa990f8669734114e0846" Oct 03 15:10:29 crc kubenswrapper[4774]: E1003 15:10:29.308425 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:10:29 crc kubenswrapper[4774]: I1003 15:10:29.314404 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10568dd1-d320-4fed-b12b-7ded3500a3e9" path="/var/lib/kubelet/pods/10568dd1-d320-4fed-b12b-7ded3500a3e9/volumes" Oct 03 15:10:29 crc kubenswrapper[4774]: I1003 15:10:29.315511 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66ac98ed-7e19-455d-825f-87ef2b381b43" path="/var/lib/kubelet/pods/66ac98ed-7e19-455d-825f-87ef2b381b43/volumes" Oct 03 15:10:29 crc kubenswrapper[4774]: I1003 15:10:29.316660 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c2fdc49-a155-4a5b-afce-314af82e3f5b" path="/var/lib/kubelet/pods/6c2fdc49-a155-4a5b-afce-314af82e3f5b/volumes" Oct 03 15:10:41 crc kubenswrapper[4774]: I1003 15:10:41.299903 4774 scope.go:117] "RemoveContainer" containerID="74d4d4fd29e66b3f46bed47cb8482d6e253bf183444fa990f8669734114e0846" Oct 03 15:10:41 crc kubenswrapper[4774]: E1003 15:10:41.303323 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:10:56 crc kubenswrapper[4774]: I1003 15:10:56.300165 4774 scope.go:117] "RemoveContainer" containerID="74d4d4fd29e66b3f46bed47cb8482d6e253bf183444fa990f8669734114e0846" Oct 03 15:10:56 crc kubenswrapper[4774]: E1003 15:10:56.301217 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:10:58 crc kubenswrapper[4774]: I1003 15:10:58.504059 4774 generic.go:334] "Generic (PLEG): container finished" podID="58fbe1ae-46c5-4bb6-99ec-61ca05d737b1" containerID="14082886358b583f9494294fe1af1aa706875b1e8c2338338229a7c8e6ab618c" exitCode=0 Oct 03 15:10:58 crc kubenswrapper[4774]: I1003 15:10:58.504169 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bh568" event={"ID":"58fbe1ae-46c5-4bb6-99ec-61ca05d737b1","Type":"ContainerDied","Data":"14082886358b583f9494294fe1af1aa706875b1e8c2338338229a7c8e6ab618c"} Oct 03 15:11:00 crc kubenswrapper[4774]: I1003 15:11:00.044258 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bh568" Oct 03 15:11:00 crc kubenswrapper[4774]: I1003 15:11:00.143268 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fkd7\" (UniqueName: \"kubernetes.io/projected/58fbe1ae-46c5-4bb6-99ec-61ca05d737b1-kube-api-access-2fkd7\") pod \"58fbe1ae-46c5-4bb6-99ec-61ca05d737b1\" (UID: \"58fbe1ae-46c5-4bb6-99ec-61ca05d737b1\") " Oct 03 15:11:00 crc kubenswrapper[4774]: I1003 15:11:00.143548 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58fbe1ae-46c5-4bb6-99ec-61ca05d737b1-inventory\") pod \"58fbe1ae-46c5-4bb6-99ec-61ca05d737b1\" (UID: \"58fbe1ae-46c5-4bb6-99ec-61ca05d737b1\") " Oct 03 15:11:00 crc kubenswrapper[4774]: I1003 15:11:00.143613 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/58fbe1ae-46c5-4bb6-99ec-61ca05d737b1-ssh-key\") pod \"58fbe1ae-46c5-4bb6-99ec-61ca05d737b1\" (UID: \"58fbe1ae-46c5-4bb6-99ec-61ca05d737b1\") " Oct 03 15:11:00 crc kubenswrapper[4774]: I1003 15:11:00.150711 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58fbe1ae-46c5-4bb6-99ec-61ca05d737b1-kube-api-access-2fkd7" (OuterVolumeSpecName: "kube-api-access-2fkd7") pod "58fbe1ae-46c5-4bb6-99ec-61ca05d737b1" (UID: "58fbe1ae-46c5-4bb6-99ec-61ca05d737b1"). InnerVolumeSpecName "kube-api-access-2fkd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:11:00 crc kubenswrapper[4774]: I1003 15:11:00.175501 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58fbe1ae-46c5-4bb6-99ec-61ca05d737b1-inventory" (OuterVolumeSpecName: "inventory") pod "58fbe1ae-46c5-4bb6-99ec-61ca05d737b1" (UID: "58fbe1ae-46c5-4bb6-99ec-61ca05d737b1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:11:00 crc kubenswrapper[4774]: I1003 15:11:00.189014 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58fbe1ae-46c5-4bb6-99ec-61ca05d737b1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "58fbe1ae-46c5-4bb6-99ec-61ca05d737b1" (UID: "58fbe1ae-46c5-4bb6-99ec-61ca05d737b1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:11:00 crc kubenswrapper[4774]: I1003 15:11:00.246792 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fkd7\" (UniqueName: \"kubernetes.io/projected/58fbe1ae-46c5-4bb6-99ec-61ca05d737b1-kube-api-access-2fkd7\") on node \"crc\" DevicePath \"\"" Oct 03 15:11:00 crc kubenswrapper[4774]: I1003 15:11:00.246830 4774 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58fbe1ae-46c5-4bb6-99ec-61ca05d737b1-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 15:11:00 crc kubenswrapper[4774]: I1003 15:11:00.246844 4774 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/58fbe1ae-46c5-4bb6-99ec-61ca05d737b1-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 15:11:00 crc kubenswrapper[4774]: I1003 15:11:00.529829 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bh568" event={"ID":"58fbe1ae-46c5-4bb6-99ec-61ca05d737b1","Type":"ContainerDied","Data":"b05cd0c297b2830fa567bab552cb5947785a169fcbe4b6473b210c9a43b0cfc3"} Oct 03 15:11:00 crc kubenswrapper[4774]: I1003 15:11:00.529897 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b05cd0c297b2830fa567bab552cb5947785a169fcbe4b6473b210c9a43b0cfc3" Oct 03 15:11:00 crc kubenswrapper[4774]: I1003 15:11:00.529991 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bh568" Oct 03 15:11:00 crc kubenswrapper[4774]: I1003 15:11:00.651260 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pnvh8"] Oct 03 15:11:00 crc kubenswrapper[4774]: E1003 15:11:00.651892 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58fbe1ae-46c5-4bb6-99ec-61ca05d737b1" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 03 15:11:00 crc kubenswrapper[4774]: I1003 15:11:00.651908 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="58fbe1ae-46c5-4bb6-99ec-61ca05d737b1" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 03 15:11:00 crc kubenswrapper[4774]: I1003 15:11:00.652080 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="58fbe1ae-46c5-4bb6-99ec-61ca05d737b1" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 03 15:11:00 crc kubenswrapper[4774]: I1003 15:11:00.652657 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pnvh8" Oct 03 15:11:00 crc kubenswrapper[4774]: I1003 15:11:00.665655 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 15:11:00 crc kubenswrapper[4774]: I1003 15:11:00.665897 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7bdzq" Oct 03 15:11:00 crc kubenswrapper[4774]: I1003 15:11:00.665934 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 15:11:00 crc kubenswrapper[4774]: I1003 15:11:00.666427 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 15:11:00 crc kubenswrapper[4774]: I1003 15:11:00.680260 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pnvh8"] Oct 03 15:11:00 crc kubenswrapper[4774]: I1003 15:11:00.768609 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9d27605e-3b35-4000-a3d5-88cecbf24b5a-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-pnvh8\" (UID: \"9d27605e-3b35-4000-a3d5-88cecbf24b5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pnvh8" Oct 03 15:11:00 crc kubenswrapper[4774]: I1003 15:11:00.768776 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9d27605e-3b35-4000-a3d5-88cecbf24b5a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-pnvh8\" (UID: \"9d27605e-3b35-4000-a3d5-88cecbf24b5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pnvh8" Oct 03 15:11:00 crc kubenswrapper[4774]: I1003 15:11:00.768861 4774 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9chp\" (UniqueName: \"kubernetes.io/projected/9d27605e-3b35-4000-a3d5-88cecbf24b5a-kube-api-access-b9chp\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-pnvh8\" (UID: \"9d27605e-3b35-4000-a3d5-88cecbf24b5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pnvh8" Oct 03 15:11:00 crc kubenswrapper[4774]: I1003 15:11:00.870733 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9chp\" (UniqueName: \"kubernetes.io/projected/9d27605e-3b35-4000-a3d5-88cecbf24b5a-kube-api-access-b9chp\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-pnvh8\" (UID: \"9d27605e-3b35-4000-a3d5-88cecbf24b5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pnvh8" Oct 03 15:11:00 crc kubenswrapper[4774]: I1003 15:11:00.870968 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9d27605e-3b35-4000-a3d5-88cecbf24b5a-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-pnvh8\" (UID: \"9d27605e-3b35-4000-a3d5-88cecbf24b5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pnvh8" Oct 03 15:11:00 crc kubenswrapper[4774]: I1003 15:11:00.871285 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9d27605e-3b35-4000-a3d5-88cecbf24b5a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-pnvh8\" (UID: \"9d27605e-3b35-4000-a3d5-88cecbf24b5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pnvh8" Oct 03 15:11:00 crc kubenswrapper[4774]: I1003 15:11:00.877856 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9d27605e-3b35-4000-a3d5-88cecbf24b5a-ssh-key\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-pnvh8\" (UID: \"9d27605e-3b35-4000-a3d5-88cecbf24b5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pnvh8" Oct 03 15:11:00 crc kubenswrapper[4774]: I1003 15:11:00.879183 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9d27605e-3b35-4000-a3d5-88cecbf24b5a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-pnvh8\" (UID: \"9d27605e-3b35-4000-a3d5-88cecbf24b5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pnvh8" Oct 03 15:11:00 crc kubenswrapper[4774]: I1003 15:11:00.904167 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9chp\" (UniqueName: \"kubernetes.io/projected/9d27605e-3b35-4000-a3d5-88cecbf24b5a-kube-api-access-b9chp\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-pnvh8\" (UID: \"9d27605e-3b35-4000-a3d5-88cecbf24b5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pnvh8" Oct 03 15:11:00 crc kubenswrapper[4774]: I1003 15:11:00.983433 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pnvh8" Oct 03 15:11:01 crc kubenswrapper[4774]: I1003 15:11:01.055807 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-2589-account-create-zksfs"] Oct 03 15:11:01 crc kubenswrapper[4774]: I1003 15:11:01.071368 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5fbc-account-create-czgkc"] Oct 03 15:11:01 crc kubenswrapper[4774]: I1003 15:11:01.114448 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-2589-account-create-zksfs"] Oct 03 15:11:01 crc kubenswrapper[4774]: I1003 15:11:01.165145 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5fbc-account-create-czgkc"] Oct 03 15:11:01 crc kubenswrapper[4774]: I1003 15:11:01.310674 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="066209ed-a6f1-4363-89d7-ad3a9b865341" path="/var/lib/kubelet/pods/066209ed-a6f1-4363-89d7-ad3a9b865341/volumes" Oct 03 15:11:01 crc kubenswrapper[4774]: I1003 15:11:01.311398 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcf3e07f-94a6-4786-9ef9-a7cb9f1562b0" path="/var/lib/kubelet/pods/fcf3e07f-94a6-4786-9ef9-a7cb9f1562b0/volumes" Oct 03 15:11:01 crc kubenswrapper[4774]: I1003 15:11:01.611347 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pnvh8"] Oct 03 15:11:01 crc kubenswrapper[4774]: I1003 15:11:01.617298 4774 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 15:11:02 crc kubenswrapper[4774]: I1003 15:11:02.051245 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-f2ct5"] Oct 03 15:11:02 crc kubenswrapper[4774]: I1003 15:11:02.065989 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-f2ct5"] Oct 03 15:11:02 crc kubenswrapper[4774]: 
I1003 15:11:02.556854 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pnvh8" event={"ID":"9d27605e-3b35-4000-a3d5-88cecbf24b5a","Type":"ContainerStarted","Data":"f839fc00bbfccb91fb8581540cad708479d88e640b375c701b9eb68503b12f26"} Oct 03 15:11:02 crc kubenswrapper[4774]: I1003 15:11:02.557145 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pnvh8" event={"ID":"9d27605e-3b35-4000-a3d5-88cecbf24b5a","Type":"ContainerStarted","Data":"85e577f57300a7de1ce07f76434c8ad453b64764d09921763d2278bbf509d332"} Oct 03 15:11:02 crc kubenswrapper[4774]: I1003 15:11:02.580226 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pnvh8" podStartSLOduration=1.9784521449999999 podStartE2EDuration="2.580203501s" podCreationTimestamp="2025-10-03 15:11:00 +0000 UTC" firstStartedPulling="2025-10-03 15:11:01.616942356 +0000 UTC m=+1684.206145818" lastFinishedPulling="2025-10-03 15:11:02.218693722 +0000 UTC m=+1684.807897174" observedRunningTime="2025-10-03 15:11:02.579622657 +0000 UTC m=+1685.168826109" watchObservedRunningTime="2025-10-03 15:11:02.580203501 +0000 UTC m=+1685.169406953" Oct 03 15:11:03 crc kubenswrapper[4774]: I1003 15:11:03.320150 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce469a02-5678-42c1-84d7-21a29c1b3d18" path="/var/lib/kubelet/pods/ce469a02-5678-42c1-84d7-21a29c1b3d18/volumes" Oct 03 15:11:09 crc kubenswrapper[4774]: I1003 15:11:09.038330 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-w7hcx"] Oct 03 15:11:09 crc kubenswrapper[4774]: I1003 15:11:09.048002 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-w7hcx"] Oct 03 15:11:09 crc kubenswrapper[4774]: I1003 15:11:09.318574 4774 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="7755d164-f1c7-4f58-91d4-5ac4ab948090" path="/var/lib/kubelet/pods/7755d164-f1c7-4f58-91d4-5ac4ab948090/volumes" Oct 03 15:11:11 crc kubenswrapper[4774]: I1003 15:11:11.299751 4774 scope.go:117] "RemoveContainer" containerID="74d4d4fd29e66b3f46bed47cb8482d6e253bf183444fa990f8669734114e0846" Oct 03 15:11:11 crc kubenswrapper[4774]: E1003 15:11:11.300314 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:11:24 crc kubenswrapper[4774]: I1003 15:11:24.877810 4774 scope.go:117] "RemoveContainer" containerID="8c116304c75bf62309ede65631dad7f489deaa5d260dfc5583261549653bb8c9" Oct 03 15:11:24 crc kubenswrapper[4774]: I1003 15:11:24.909092 4774 scope.go:117] "RemoveContainer" containerID="a94a76982647cf91a85aa74e07b8e59fb2f08a059d57bfef113647c736fe4e63" Oct 03 15:11:24 crc kubenswrapper[4774]: I1003 15:11:24.950177 4774 scope.go:117] "RemoveContainer" containerID="81e49b5c658d68212ef1d604598bf80afc030621e90b6eb6d556d4fc17280e35" Oct 03 15:11:25 crc kubenswrapper[4774]: I1003 15:11:25.030442 4774 scope.go:117] "RemoveContainer" containerID="c900d8688d36e8c1c04f32c90d1c12885d723ac8e764adebcd57c448a592060c" Oct 03 15:11:25 crc kubenswrapper[4774]: I1003 15:11:25.052170 4774 scope.go:117] "RemoveContainer" containerID="ab01957b32518039713a0e69498e4214d0b7b55eeb3d23b81bf87c15881fa6f6" Oct 03 15:11:25 crc kubenswrapper[4774]: I1003 15:11:25.093531 4774 scope.go:117] "RemoveContainer" containerID="ddf6f6d44c9b2467a9d47d4ffb5acdd138bfec1bd41c008d98dbe3fbd1096afb" Oct 03 15:11:25 crc kubenswrapper[4774]: I1003 15:11:25.130597 4774 scope.go:117] "RemoveContainer" 
containerID="57bcf11aa1d70c62aae6c53fcb319fd0412bb5ee497f9656611a4b4bf0ea9c52" Oct 03 15:11:25 crc kubenswrapper[4774]: I1003 15:11:25.151685 4774 scope.go:117] "RemoveContainer" containerID="7b39a8fe987c0e29a566f711e63a46f7eb92578ea76b8c8df8f17634c9385a91" Oct 03 15:11:25 crc kubenswrapper[4774]: I1003 15:11:25.169277 4774 scope.go:117] "RemoveContainer" containerID="6fb064fe23f4aa8e4bd57b9b12104c84f6132e63dc26661ddd367b3baae405dc" Oct 03 15:11:26 crc kubenswrapper[4774]: I1003 15:11:26.058209 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-gp26k"] Oct 03 15:11:26 crc kubenswrapper[4774]: I1003 15:11:26.073137 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-gp26k"] Oct 03 15:11:26 crc kubenswrapper[4774]: I1003 15:11:26.300560 4774 scope.go:117] "RemoveContainer" containerID="74d4d4fd29e66b3f46bed47cb8482d6e253bf183444fa990f8669734114e0846" Oct 03 15:11:26 crc kubenswrapper[4774]: E1003 15:11:26.301029 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:11:27 crc kubenswrapper[4774]: I1003 15:11:27.314568 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29f3857a-d03c-48ab-93c4-0d75fc497c0e" path="/var/lib/kubelet/pods/29f3857a-d03c-48ab-93c4-0d75fc497c0e/volumes" Oct 03 15:11:33 crc kubenswrapper[4774]: I1003 15:11:33.035493 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-sx6cl"] Oct 03 15:11:33 crc kubenswrapper[4774]: I1003 15:11:33.045032 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-sx6cl"] Oct 03 
15:11:33 crc kubenswrapper[4774]: I1003 15:11:33.317168 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be96775d-6115-4c8e-8539-230de2424b0e" path="/var/lib/kubelet/pods/be96775d-6115-4c8e-8539-230de2424b0e/volumes" Oct 03 15:11:37 crc kubenswrapper[4774]: I1003 15:11:37.056351 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-66tpj"] Oct 03 15:11:37 crc kubenswrapper[4774]: I1003 15:11:37.071182 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-66tpj"] Oct 03 15:11:37 crc kubenswrapper[4774]: I1003 15:11:37.300360 4774 scope.go:117] "RemoveContainer" containerID="74d4d4fd29e66b3f46bed47cb8482d6e253bf183444fa990f8669734114e0846" Oct 03 15:11:37 crc kubenswrapper[4774]: E1003 15:11:37.301189 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:11:37 crc kubenswrapper[4774]: I1003 15:11:37.319192 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdc24810-0778-4b37-8156-ecac9ae9e077" path="/var/lib/kubelet/pods/cdc24810-0778-4b37-8156-ecac9ae9e077/volumes" Oct 03 15:11:51 crc kubenswrapper[4774]: I1003 15:11:51.298922 4774 scope.go:117] "RemoveContainer" containerID="74d4d4fd29e66b3f46bed47cb8482d6e253bf183444fa990f8669734114e0846" Oct 03 15:11:51 crc kubenswrapper[4774]: E1003 15:11:51.299968 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:12:01 crc kubenswrapper[4774]: I1003 15:12:01.062911 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-mgszf"] Oct 03 15:12:01 crc kubenswrapper[4774]: I1003 15:12:01.078156 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-8qxhd"] Oct 03 15:12:01 crc kubenswrapper[4774]: I1003 15:12:01.090260 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-tggq5"] Oct 03 15:12:01 crc kubenswrapper[4774]: I1003 15:12:01.099116 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-mgszf"] Oct 03 15:12:01 crc kubenswrapper[4774]: I1003 15:12:01.108028 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-tggq5"] Oct 03 15:12:01 crc kubenswrapper[4774]: I1003 15:12:01.116623 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-8qxhd"] Oct 03 15:12:01 crc kubenswrapper[4774]: I1003 15:12:01.310317 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="367d8aee-34b7-485e-848a-3e267afa8fd6" path="/var/lib/kubelet/pods/367d8aee-34b7-485e-848a-3e267afa8fd6/volumes" Oct 03 15:12:01 crc kubenswrapper[4774]: I1003 15:12:01.311029 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38b934ad-cf29-40d1-993c-8dcc6b8c0b8c" path="/var/lib/kubelet/pods/38b934ad-cf29-40d1-993c-8dcc6b8c0b8c/volumes" Oct 03 15:12:01 crc kubenswrapper[4774]: I1003 15:12:01.311770 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dde1c72-df79-4066-837c-0e318b636b73" path="/var/lib/kubelet/pods/6dde1c72-df79-4066-837c-0e318b636b73/volumes" Oct 03 15:12:02 crc kubenswrapper[4774]: I1003 
15:12:02.300134 4774 scope.go:117] "RemoveContainer" containerID="74d4d4fd29e66b3f46bed47cb8482d6e253bf183444fa990f8669734114e0846" Oct 03 15:12:02 crc kubenswrapper[4774]: E1003 15:12:02.301040 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:12:08 crc kubenswrapper[4774]: I1003 15:12:08.048441 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-22ec-account-create-tbdq7"] Oct 03 15:12:08 crc kubenswrapper[4774]: I1003 15:12:08.059261 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-22ec-account-create-tbdq7"] Oct 03 15:12:09 crc kubenswrapper[4774]: I1003 15:12:09.041280 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-d56e-account-create-h6kfv"] Oct 03 15:12:09 crc kubenswrapper[4774]: I1003 15:12:09.056306 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-abbe-account-create-tmxqt"] Oct 03 15:12:09 crc kubenswrapper[4774]: I1003 15:12:09.066996 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-abbe-account-create-tmxqt"] Oct 03 15:12:09 crc kubenswrapper[4774]: I1003 15:12:09.075981 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-d56e-account-create-h6kfv"] Oct 03 15:12:09 crc kubenswrapper[4774]: I1003 15:12:09.313819 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55fbe210-dfe5-4488-9461-8b1f67d30f49" path="/var/lib/kubelet/pods/55fbe210-dfe5-4488-9461-8b1f67d30f49/volumes" Oct 03 15:12:09 crc kubenswrapper[4774]: I1003 15:12:09.314410 4774 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="bc84e6e6-c0c7-44f6-8834-0e9f5e52f98c" path="/var/lib/kubelet/pods/bc84e6e6-c0c7-44f6-8834-0e9f5e52f98c/volumes" Oct 03 15:12:09 crc kubenswrapper[4774]: I1003 15:12:09.314882 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1965846-33db-41a7-b097-d26e5a398986" path="/var/lib/kubelet/pods/e1965846-33db-41a7-b097-d26e5a398986/volumes" Oct 03 15:12:14 crc kubenswrapper[4774]: I1003 15:12:14.299557 4774 scope.go:117] "RemoveContainer" containerID="74d4d4fd29e66b3f46bed47cb8482d6e253bf183444fa990f8669734114e0846" Oct 03 15:12:14 crc kubenswrapper[4774]: E1003 15:12:14.300361 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:12:19 crc kubenswrapper[4774]: I1003 15:12:19.401467 4774 generic.go:334] "Generic (PLEG): container finished" podID="9d27605e-3b35-4000-a3d5-88cecbf24b5a" containerID="f839fc00bbfccb91fb8581540cad708479d88e640b375c701b9eb68503b12f26" exitCode=0 Oct 03 15:12:19 crc kubenswrapper[4774]: I1003 15:12:19.401537 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pnvh8" event={"ID":"9d27605e-3b35-4000-a3d5-88cecbf24b5a","Type":"ContainerDied","Data":"f839fc00bbfccb91fb8581540cad708479d88e640b375c701b9eb68503b12f26"} Oct 03 15:12:20 crc kubenswrapper[4774]: I1003 15:12:20.803330 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pnvh8" Oct 03 15:12:20 crc kubenswrapper[4774]: I1003 15:12:20.987084 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9d27605e-3b35-4000-a3d5-88cecbf24b5a-inventory\") pod \"9d27605e-3b35-4000-a3d5-88cecbf24b5a\" (UID: \"9d27605e-3b35-4000-a3d5-88cecbf24b5a\") " Oct 03 15:12:20 crc kubenswrapper[4774]: I1003 15:12:20.987206 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9d27605e-3b35-4000-a3d5-88cecbf24b5a-ssh-key\") pod \"9d27605e-3b35-4000-a3d5-88cecbf24b5a\" (UID: \"9d27605e-3b35-4000-a3d5-88cecbf24b5a\") " Oct 03 15:12:20 crc kubenswrapper[4774]: I1003 15:12:20.987273 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9chp\" (UniqueName: \"kubernetes.io/projected/9d27605e-3b35-4000-a3d5-88cecbf24b5a-kube-api-access-b9chp\") pod \"9d27605e-3b35-4000-a3d5-88cecbf24b5a\" (UID: \"9d27605e-3b35-4000-a3d5-88cecbf24b5a\") " Oct 03 15:12:20 crc kubenswrapper[4774]: I1003 15:12:20.992084 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d27605e-3b35-4000-a3d5-88cecbf24b5a-kube-api-access-b9chp" (OuterVolumeSpecName: "kube-api-access-b9chp") pod "9d27605e-3b35-4000-a3d5-88cecbf24b5a" (UID: "9d27605e-3b35-4000-a3d5-88cecbf24b5a"). InnerVolumeSpecName "kube-api-access-b9chp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:12:21 crc kubenswrapper[4774]: I1003 15:12:21.012906 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d27605e-3b35-4000-a3d5-88cecbf24b5a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9d27605e-3b35-4000-a3d5-88cecbf24b5a" (UID: "9d27605e-3b35-4000-a3d5-88cecbf24b5a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:12:21 crc kubenswrapper[4774]: I1003 15:12:21.025490 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d27605e-3b35-4000-a3d5-88cecbf24b5a-inventory" (OuterVolumeSpecName: "inventory") pod "9d27605e-3b35-4000-a3d5-88cecbf24b5a" (UID: "9d27605e-3b35-4000-a3d5-88cecbf24b5a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:12:21 crc kubenswrapper[4774]: I1003 15:12:21.089568 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9chp\" (UniqueName: \"kubernetes.io/projected/9d27605e-3b35-4000-a3d5-88cecbf24b5a-kube-api-access-b9chp\") on node \"crc\" DevicePath \"\"" Oct 03 15:12:21 crc kubenswrapper[4774]: I1003 15:12:21.089603 4774 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9d27605e-3b35-4000-a3d5-88cecbf24b5a-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 15:12:21 crc kubenswrapper[4774]: I1003 15:12:21.089611 4774 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9d27605e-3b35-4000-a3d5-88cecbf24b5a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 15:12:21 crc kubenswrapper[4774]: I1003 15:12:21.425258 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pnvh8" event={"ID":"9d27605e-3b35-4000-a3d5-88cecbf24b5a","Type":"ContainerDied","Data":"85e577f57300a7de1ce07f76434c8ad453b64764d09921763d2278bbf509d332"} Oct 03 15:12:21 crc kubenswrapper[4774]: I1003 15:12:21.425305 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85e577f57300a7de1ce07f76434c8ad453b64764d09921763d2278bbf509d332" Oct 03 15:12:21 crc kubenswrapper[4774]: I1003 15:12:21.425361 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pnvh8" Oct 03 15:12:21 crc kubenswrapper[4774]: I1003 15:12:21.522197 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lkc9p"] Oct 03 15:12:21 crc kubenswrapper[4774]: E1003 15:12:21.522977 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d27605e-3b35-4000-a3d5-88cecbf24b5a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 03 15:12:21 crc kubenswrapper[4774]: I1003 15:12:21.523090 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d27605e-3b35-4000-a3d5-88cecbf24b5a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 03 15:12:21 crc kubenswrapper[4774]: I1003 15:12:21.523430 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d27605e-3b35-4000-a3d5-88cecbf24b5a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 03 15:12:21 crc kubenswrapper[4774]: I1003 15:12:21.526254 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lkc9p" Oct 03 15:12:21 crc kubenswrapper[4774]: I1003 15:12:21.530663 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 15:12:21 crc kubenswrapper[4774]: I1003 15:12:21.531323 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 15:12:21 crc kubenswrapper[4774]: I1003 15:12:21.531671 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 15:12:21 crc kubenswrapper[4774]: I1003 15:12:21.537872 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lkc9p"] Oct 03 15:12:21 crc kubenswrapper[4774]: I1003 15:12:21.539913 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7bdzq" Oct 03 15:12:21 crc kubenswrapper[4774]: I1003 15:12:21.701715 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b021b515-09e3-4fcd-b448-c8169043f86c-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lkc9p\" (UID: \"b021b515-09e3-4fcd-b448-c8169043f86c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lkc9p" Oct 03 15:12:21 crc kubenswrapper[4774]: I1003 15:12:21.702080 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dp6g\" (UniqueName: \"kubernetes.io/projected/b021b515-09e3-4fcd-b448-c8169043f86c-kube-api-access-5dp6g\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lkc9p\" (UID: \"b021b515-09e3-4fcd-b448-c8169043f86c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lkc9p" Oct 03 15:12:21 crc kubenswrapper[4774]: I1003 
15:12:21.702184 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b021b515-09e3-4fcd-b448-c8169043f86c-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lkc9p\" (UID: \"b021b515-09e3-4fcd-b448-c8169043f86c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lkc9p" Oct 03 15:12:21 crc kubenswrapper[4774]: I1003 15:12:21.803877 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b021b515-09e3-4fcd-b448-c8169043f86c-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lkc9p\" (UID: \"b021b515-09e3-4fcd-b448-c8169043f86c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lkc9p" Oct 03 15:12:21 crc kubenswrapper[4774]: I1003 15:12:21.803927 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dp6g\" (UniqueName: \"kubernetes.io/projected/b021b515-09e3-4fcd-b448-c8169043f86c-kube-api-access-5dp6g\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lkc9p\" (UID: \"b021b515-09e3-4fcd-b448-c8169043f86c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lkc9p" Oct 03 15:12:21 crc kubenswrapper[4774]: I1003 15:12:21.804000 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b021b515-09e3-4fcd-b448-c8169043f86c-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lkc9p\" (UID: \"b021b515-09e3-4fcd-b448-c8169043f86c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lkc9p" Oct 03 15:12:21 crc kubenswrapper[4774]: I1003 15:12:21.808748 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b021b515-09e3-4fcd-b448-c8169043f86c-ssh-key\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-lkc9p\" (UID: \"b021b515-09e3-4fcd-b448-c8169043f86c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lkc9p" Oct 03 15:12:21 crc kubenswrapper[4774]: I1003 15:12:21.809304 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b021b515-09e3-4fcd-b448-c8169043f86c-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lkc9p\" (UID: \"b021b515-09e3-4fcd-b448-c8169043f86c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lkc9p" Oct 03 15:12:21 crc kubenswrapper[4774]: I1003 15:12:21.824180 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dp6g\" (UniqueName: \"kubernetes.io/projected/b021b515-09e3-4fcd-b448-c8169043f86c-kube-api-access-5dp6g\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lkc9p\" (UID: \"b021b515-09e3-4fcd-b448-c8169043f86c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lkc9p" Oct 03 15:12:21 crc kubenswrapper[4774]: I1003 15:12:21.855231 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lkc9p" Oct 03 15:12:22 crc kubenswrapper[4774]: I1003 15:12:22.197875 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lkc9p"] Oct 03 15:12:22 crc kubenswrapper[4774]: I1003 15:12:22.455882 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lkc9p" event={"ID":"b021b515-09e3-4fcd-b448-c8169043f86c","Type":"ContainerStarted","Data":"01cfc77d7898ee834fba618c321a0c85630551c00733cf119174f41263174bc2"} Oct 03 15:12:23 crc kubenswrapper[4774]: I1003 15:12:23.464754 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lkc9p" event={"ID":"b021b515-09e3-4fcd-b448-c8169043f86c","Type":"ContainerStarted","Data":"d2daf948629296fcadc261a28438a6db8a0428bffcd2c2a8aab7dc37191983a0"} Oct 03 15:12:23 crc kubenswrapper[4774]: I1003 15:12:23.485862 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lkc9p" podStartSLOduration=1.56249046 podStartE2EDuration="2.485838083s" podCreationTimestamp="2025-10-03 15:12:21 +0000 UTC" firstStartedPulling="2025-10-03 15:12:22.20410916 +0000 UTC m=+1764.793312612" lastFinishedPulling="2025-10-03 15:12:23.127456773 +0000 UTC m=+1765.716660235" observedRunningTime="2025-10-03 15:12:23.482037149 +0000 UTC m=+1766.071240601" watchObservedRunningTime="2025-10-03 15:12:23.485838083 +0000 UTC m=+1766.075041545" Oct 03 15:12:25 crc kubenswrapper[4774]: I1003 15:12:25.361088 4774 scope.go:117] "RemoveContainer" containerID="9c25c44261e089e994c953912b63d0e3c8063795102245e228aa8fbeb35b0508" Oct 03 15:12:25 crc kubenswrapper[4774]: I1003 15:12:25.404366 4774 scope.go:117] "RemoveContainer" containerID="ce201ca2e39eab9f9957879b76b9964af29817ef6a45ec70c4b43086049a53a1" Oct 
03 15:12:25 crc kubenswrapper[4774]: I1003 15:12:25.484199 4774 scope.go:117] "RemoveContainer" containerID="3e491844db25d065fe046339b9812d693015a5f1b6064363a997ecb709ecf30d" Oct 03 15:12:25 crc kubenswrapper[4774]: I1003 15:12:25.538361 4774 scope.go:117] "RemoveContainer" containerID="916528f6fadc363a6d990690802f342d8a661299deb5228e84ec48349173d3ba" Oct 03 15:12:25 crc kubenswrapper[4774]: I1003 15:12:25.567623 4774 scope.go:117] "RemoveContainer" containerID="b521d19d25187d27ea34d826f7b27192b58a98825368c09ffc418e2a3b582745" Oct 03 15:12:25 crc kubenswrapper[4774]: I1003 15:12:25.602116 4774 scope.go:117] "RemoveContainer" containerID="5329abe20e4af465ff237a25f80e8f2134a085e9f5caaede1ca3af4df38fd631" Oct 03 15:12:25 crc kubenswrapper[4774]: I1003 15:12:25.660490 4774 scope.go:117] "RemoveContainer" containerID="3721a4b95ab469a9fef4cef5499af290fd97dc2313b7266d4514c7d233652a61" Oct 03 15:12:25 crc kubenswrapper[4774]: I1003 15:12:25.707238 4774 scope.go:117] "RemoveContainer" containerID="a1756df0b2652de10e71806e9f194ae59a9fb4519adeccb603d2ae3212145a32" Oct 03 15:12:25 crc kubenswrapper[4774]: I1003 15:12:25.728446 4774 scope.go:117] "RemoveContainer" containerID="114b2abde0eaa8c2e671f489ff2547660d05d7223bfaa5bd2166012758f295d5" Oct 03 15:12:28 crc kubenswrapper[4774]: I1003 15:12:28.300548 4774 scope.go:117] "RemoveContainer" containerID="74d4d4fd29e66b3f46bed47cb8482d6e253bf183444fa990f8669734114e0846" Oct 03 15:12:28 crc kubenswrapper[4774]: E1003 15:12:28.301122 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:12:28 crc kubenswrapper[4774]: I1003 15:12:28.513083 4774 
generic.go:334] "Generic (PLEG): container finished" podID="b021b515-09e3-4fcd-b448-c8169043f86c" containerID="d2daf948629296fcadc261a28438a6db8a0428bffcd2c2a8aab7dc37191983a0" exitCode=0 Oct 03 15:12:28 crc kubenswrapper[4774]: I1003 15:12:28.513122 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lkc9p" event={"ID":"b021b515-09e3-4fcd-b448-c8169043f86c","Type":"ContainerDied","Data":"d2daf948629296fcadc261a28438a6db8a0428bffcd2c2a8aab7dc37191983a0"} Oct 03 15:12:29 crc kubenswrapper[4774]: I1003 15:12:29.970825 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lkc9p" Oct 03 15:12:30 crc kubenswrapper[4774]: I1003 15:12:30.068202 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dp6g\" (UniqueName: \"kubernetes.io/projected/b021b515-09e3-4fcd-b448-c8169043f86c-kube-api-access-5dp6g\") pod \"b021b515-09e3-4fcd-b448-c8169043f86c\" (UID: \"b021b515-09e3-4fcd-b448-c8169043f86c\") " Oct 03 15:12:30 crc kubenswrapper[4774]: I1003 15:12:30.068413 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b021b515-09e3-4fcd-b448-c8169043f86c-inventory\") pod \"b021b515-09e3-4fcd-b448-c8169043f86c\" (UID: \"b021b515-09e3-4fcd-b448-c8169043f86c\") " Oct 03 15:12:30 crc kubenswrapper[4774]: I1003 15:12:30.068529 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b021b515-09e3-4fcd-b448-c8169043f86c-ssh-key\") pod \"b021b515-09e3-4fcd-b448-c8169043f86c\" (UID: \"b021b515-09e3-4fcd-b448-c8169043f86c\") " Oct 03 15:12:30 crc kubenswrapper[4774]: I1003 15:12:30.074183 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b021b515-09e3-4fcd-b448-c8169043f86c-kube-api-access-5dp6g" (OuterVolumeSpecName: "kube-api-access-5dp6g") pod "b021b515-09e3-4fcd-b448-c8169043f86c" (UID: "b021b515-09e3-4fcd-b448-c8169043f86c"). InnerVolumeSpecName "kube-api-access-5dp6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:12:30 crc kubenswrapper[4774]: I1003 15:12:30.116548 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b021b515-09e3-4fcd-b448-c8169043f86c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b021b515-09e3-4fcd-b448-c8169043f86c" (UID: "b021b515-09e3-4fcd-b448-c8169043f86c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:12:30 crc kubenswrapper[4774]: I1003 15:12:30.117642 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b021b515-09e3-4fcd-b448-c8169043f86c-inventory" (OuterVolumeSpecName: "inventory") pod "b021b515-09e3-4fcd-b448-c8169043f86c" (UID: "b021b515-09e3-4fcd-b448-c8169043f86c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:12:30 crc kubenswrapper[4774]: I1003 15:12:30.170392 4774 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b021b515-09e3-4fcd-b448-c8169043f86c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 15:12:30 crc kubenswrapper[4774]: I1003 15:12:30.170428 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dp6g\" (UniqueName: \"kubernetes.io/projected/b021b515-09e3-4fcd-b448-c8169043f86c-kube-api-access-5dp6g\") on node \"crc\" DevicePath \"\"" Oct 03 15:12:30 crc kubenswrapper[4774]: I1003 15:12:30.170446 4774 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b021b515-09e3-4fcd-b448-c8169043f86c-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 15:12:30 crc kubenswrapper[4774]: I1003 15:12:30.540364 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lkc9p" event={"ID":"b021b515-09e3-4fcd-b448-c8169043f86c","Type":"ContainerDied","Data":"01cfc77d7898ee834fba618c321a0c85630551c00733cf119174f41263174bc2"} Oct 03 15:12:30 crc kubenswrapper[4774]: I1003 15:12:30.540469 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01cfc77d7898ee834fba618c321a0c85630551c00733cf119174f41263174bc2" Oct 03 15:12:30 crc kubenswrapper[4774]: I1003 15:12:30.540570 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lkc9p" Oct 03 15:12:30 crc kubenswrapper[4774]: I1003 15:12:30.614794 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-tzhjs"] Oct 03 15:12:30 crc kubenswrapper[4774]: E1003 15:12:30.615277 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b021b515-09e3-4fcd-b448-c8169043f86c" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 03 15:12:30 crc kubenswrapper[4774]: I1003 15:12:30.615299 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="b021b515-09e3-4fcd-b448-c8169043f86c" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 03 15:12:30 crc kubenswrapper[4774]: I1003 15:12:30.615533 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="b021b515-09e3-4fcd-b448-c8169043f86c" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 03 15:12:30 crc kubenswrapper[4774]: I1003 15:12:30.616264 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tzhjs" Oct 03 15:12:30 crc kubenswrapper[4774]: I1003 15:12:30.618739 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 15:12:30 crc kubenswrapper[4774]: I1003 15:12:30.618796 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7bdzq" Oct 03 15:12:30 crc kubenswrapper[4774]: I1003 15:12:30.619613 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 15:12:30 crc kubenswrapper[4774]: I1003 15:12:30.620488 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 15:12:30 crc kubenswrapper[4774]: I1003 15:12:30.644455 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-tzhjs"] Oct 03 15:12:30 crc kubenswrapper[4774]: I1003 15:12:30.679932 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8595599-9969-4c2e-bc3a-f2ff038d8c11-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tzhjs\" (UID: \"c8595599-9969-4c2e-bc3a-f2ff038d8c11\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tzhjs" Oct 03 15:12:30 crc kubenswrapper[4774]: I1003 15:12:30.679978 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kw2d\" (UniqueName: \"kubernetes.io/projected/c8595599-9969-4c2e-bc3a-f2ff038d8c11-kube-api-access-2kw2d\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tzhjs\" (UID: \"c8595599-9969-4c2e-bc3a-f2ff038d8c11\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tzhjs" Oct 03 15:12:30 crc kubenswrapper[4774]: I1003 15:12:30.680094 4774 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c8595599-9969-4c2e-bc3a-f2ff038d8c11-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tzhjs\" (UID: \"c8595599-9969-4c2e-bc3a-f2ff038d8c11\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tzhjs" Oct 03 15:12:30 crc kubenswrapper[4774]: I1003 15:12:30.781754 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c8595599-9969-4c2e-bc3a-f2ff038d8c11-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tzhjs\" (UID: \"c8595599-9969-4c2e-bc3a-f2ff038d8c11\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tzhjs" Oct 03 15:12:30 crc kubenswrapper[4774]: I1003 15:12:30.781863 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8595599-9969-4c2e-bc3a-f2ff038d8c11-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tzhjs\" (UID: \"c8595599-9969-4c2e-bc3a-f2ff038d8c11\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tzhjs" Oct 03 15:12:30 crc kubenswrapper[4774]: I1003 15:12:30.781892 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kw2d\" (UniqueName: \"kubernetes.io/projected/c8595599-9969-4c2e-bc3a-f2ff038d8c11-kube-api-access-2kw2d\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tzhjs\" (UID: \"c8595599-9969-4c2e-bc3a-f2ff038d8c11\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tzhjs" Oct 03 15:12:30 crc kubenswrapper[4774]: I1003 15:12:30.798179 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c8595599-9969-4c2e-bc3a-f2ff038d8c11-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tzhjs\" (UID: 
\"c8595599-9969-4c2e-bc3a-f2ff038d8c11\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tzhjs" Oct 03 15:12:30 crc kubenswrapper[4774]: I1003 15:12:30.798239 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8595599-9969-4c2e-bc3a-f2ff038d8c11-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tzhjs\" (UID: \"c8595599-9969-4c2e-bc3a-f2ff038d8c11\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tzhjs" Oct 03 15:12:30 crc kubenswrapper[4774]: I1003 15:12:30.803063 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kw2d\" (UniqueName: \"kubernetes.io/projected/c8595599-9969-4c2e-bc3a-f2ff038d8c11-kube-api-access-2kw2d\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tzhjs\" (UID: \"c8595599-9969-4c2e-bc3a-f2ff038d8c11\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tzhjs" Oct 03 15:12:30 crc kubenswrapper[4774]: I1003 15:12:30.937424 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tzhjs" Oct 03 15:12:31 crc kubenswrapper[4774]: I1003 15:12:31.469048 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-tzhjs"] Oct 03 15:12:31 crc kubenswrapper[4774]: I1003 15:12:31.550568 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tzhjs" event={"ID":"c8595599-9969-4c2e-bc3a-f2ff038d8c11","Type":"ContainerStarted","Data":"07a8cd6a1d6c67584308aa19e346e3e664f448ad71c7f5cc69b825be8a8b7b8d"} Oct 03 15:12:32 crc kubenswrapper[4774]: I1003 15:12:32.558612 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tzhjs" event={"ID":"c8595599-9969-4c2e-bc3a-f2ff038d8c11","Type":"ContainerStarted","Data":"c13b877091443574b859556d5dc2a5141f1f91952fc066966a7f83aa0d5391b8"} Oct 03 15:12:32 crc kubenswrapper[4774]: I1003 15:12:32.578403 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tzhjs" podStartSLOduration=1.955133133 podStartE2EDuration="2.578360141s" podCreationTimestamp="2025-10-03 15:12:30 +0000 UTC" firstStartedPulling="2025-10-03 15:12:31.478094074 +0000 UTC m=+1774.067297536" lastFinishedPulling="2025-10-03 15:12:32.101321072 +0000 UTC m=+1774.690524544" observedRunningTime="2025-10-03 15:12:32.571539101 +0000 UTC m=+1775.160742553" watchObservedRunningTime="2025-10-03 15:12:32.578360141 +0000 UTC m=+1775.167563603" Oct 03 15:12:37 crc kubenswrapper[4774]: I1003 15:12:37.053721 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wpkth"] Oct 03 15:12:37 crc kubenswrapper[4774]: I1003 15:12:37.069853 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wpkth"] Oct 03 15:12:37 crc kubenswrapper[4774]: I1003 
15:12:37.310614 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c24e835e-fdf1-44ec-ad96-3b54ab88253e" path="/var/lib/kubelet/pods/c24e835e-fdf1-44ec-ad96-3b54ab88253e/volumes" Oct 03 15:12:40 crc kubenswrapper[4774]: I1003 15:12:40.299655 4774 scope.go:117] "RemoveContainer" containerID="74d4d4fd29e66b3f46bed47cb8482d6e253bf183444fa990f8669734114e0846" Oct 03 15:12:40 crc kubenswrapper[4774]: E1003 15:12:40.300426 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:12:51 crc kubenswrapper[4774]: I1003 15:12:51.300316 4774 scope.go:117] "RemoveContainer" containerID="74d4d4fd29e66b3f46bed47cb8482d6e253bf183444fa990f8669734114e0846" Oct 03 15:12:51 crc kubenswrapper[4774]: E1003 15:12:51.301641 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:13:00 crc kubenswrapper[4774]: I1003 15:13:00.058850 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-5fb4f"] Oct 03 15:13:00 crc kubenswrapper[4774]: I1003 15:13:00.071275 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-5fb4f"] Oct 03 15:13:01 crc kubenswrapper[4774]: I1003 15:13:01.317501 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="77049c85-7aed-49f4-8dff-4a9a7a3a6b06" path="/var/lib/kubelet/pods/77049c85-7aed-49f4-8dff-4a9a7a3a6b06/volumes" Oct 03 15:13:02 crc kubenswrapper[4774]: I1003 15:13:02.300415 4774 scope.go:117] "RemoveContainer" containerID="74d4d4fd29e66b3f46bed47cb8482d6e253bf183444fa990f8669734114e0846" Oct 03 15:13:02 crc kubenswrapper[4774]: E1003 15:13:02.300836 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:13:04 crc kubenswrapper[4774]: I1003 15:13:04.042414 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zsswn"] Oct 03 15:13:04 crc kubenswrapper[4774]: I1003 15:13:04.052150 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zsswn"] Oct 03 15:13:05 crc kubenswrapper[4774]: I1003 15:13:05.323732 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c7e28d1-6897-4f5f-ad56-6036055365ad" path="/var/lib/kubelet/pods/6c7e28d1-6897-4f5f-ad56-6036055365ad/volumes" Oct 03 15:13:07 crc kubenswrapper[4774]: I1003 15:13:07.923544 4774 generic.go:334] "Generic (PLEG): container finished" podID="c8595599-9969-4c2e-bc3a-f2ff038d8c11" containerID="c13b877091443574b859556d5dc2a5141f1f91952fc066966a7f83aa0d5391b8" exitCode=0 Oct 03 15:13:07 crc kubenswrapper[4774]: I1003 15:13:07.923640 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tzhjs" event={"ID":"c8595599-9969-4c2e-bc3a-f2ff038d8c11","Type":"ContainerDied","Data":"c13b877091443574b859556d5dc2a5141f1f91952fc066966a7f83aa0d5391b8"} Oct 03 
15:13:09 crc kubenswrapper[4774]: I1003 15:13:09.352909 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tzhjs" Oct 03 15:13:09 crc kubenswrapper[4774]: I1003 15:13:09.473050 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kw2d\" (UniqueName: \"kubernetes.io/projected/c8595599-9969-4c2e-bc3a-f2ff038d8c11-kube-api-access-2kw2d\") pod \"c8595599-9969-4c2e-bc3a-f2ff038d8c11\" (UID: \"c8595599-9969-4c2e-bc3a-f2ff038d8c11\") " Oct 03 15:13:09 crc kubenswrapper[4774]: I1003 15:13:09.473145 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8595599-9969-4c2e-bc3a-f2ff038d8c11-inventory\") pod \"c8595599-9969-4c2e-bc3a-f2ff038d8c11\" (UID: \"c8595599-9969-4c2e-bc3a-f2ff038d8c11\") " Oct 03 15:13:09 crc kubenswrapper[4774]: I1003 15:13:09.473179 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c8595599-9969-4c2e-bc3a-f2ff038d8c11-ssh-key\") pod \"c8595599-9969-4c2e-bc3a-f2ff038d8c11\" (UID: \"c8595599-9969-4c2e-bc3a-f2ff038d8c11\") " Oct 03 15:13:09 crc kubenswrapper[4774]: I1003 15:13:09.480657 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8595599-9969-4c2e-bc3a-f2ff038d8c11-kube-api-access-2kw2d" (OuterVolumeSpecName: "kube-api-access-2kw2d") pod "c8595599-9969-4c2e-bc3a-f2ff038d8c11" (UID: "c8595599-9969-4c2e-bc3a-f2ff038d8c11"). InnerVolumeSpecName "kube-api-access-2kw2d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:13:09 crc kubenswrapper[4774]: I1003 15:13:09.506629 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8595599-9969-4c2e-bc3a-f2ff038d8c11-inventory" (OuterVolumeSpecName: "inventory") pod "c8595599-9969-4c2e-bc3a-f2ff038d8c11" (UID: "c8595599-9969-4c2e-bc3a-f2ff038d8c11"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:13:09 crc kubenswrapper[4774]: I1003 15:13:09.517931 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8595599-9969-4c2e-bc3a-f2ff038d8c11-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c8595599-9969-4c2e-bc3a-f2ff038d8c11" (UID: "c8595599-9969-4c2e-bc3a-f2ff038d8c11"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:13:09 crc kubenswrapper[4774]: I1003 15:13:09.575757 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kw2d\" (UniqueName: \"kubernetes.io/projected/c8595599-9969-4c2e-bc3a-f2ff038d8c11-kube-api-access-2kw2d\") on node \"crc\" DevicePath \"\"" Oct 03 15:13:09 crc kubenswrapper[4774]: I1003 15:13:09.575789 4774 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8595599-9969-4c2e-bc3a-f2ff038d8c11-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 15:13:09 crc kubenswrapper[4774]: I1003 15:13:09.575798 4774 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c8595599-9969-4c2e-bc3a-f2ff038d8c11-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 15:13:09 crc kubenswrapper[4774]: I1003 15:13:09.949097 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tzhjs" 
event={"ID":"c8595599-9969-4c2e-bc3a-f2ff038d8c11","Type":"ContainerDied","Data":"07a8cd6a1d6c67584308aa19e346e3e664f448ad71c7f5cc69b825be8a8b7b8d"} Oct 03 15:13:09 crc kubenswrapper[4774]: I1003 15:13:09.949151 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07a8cd6a1d6c67584308aa19e346e3e664f448ad71c7f5cc69b825be8a8b7b8d" Oct 03 15:13:09 crc kubenswrapper[4774]: I1003 15:13:09.949155 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tzhjs" Oct 03 15:13:10 crc kubenswrapper[4774]: I1003 15:13:10.096782 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-97dwl"] Oct 03 15:13:10 crc kubenswrapper[4774]: E1003 15:13:10.097541 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8595599-9969-4c2e-bc3a-f2ff038d8c11" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 03 15:13:10 crc kubenswrapper[4774]: I1003 15:13:10.097567 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8595599-9969-4c2e-bc3a-f2ff038d8c11" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 03 15:13:10 crc kubenswrapper[4774]: I1003 15:13:10.097839 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8595599-9969-4c2e-bc3a-f2ff038d8c11" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 03 15:13:10 crc kubenswrapper[4774]: I1003 15:13:10.098624 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-97dwl" Oct 03 15:13:10 crc kubenswrapper[4774]: I1003 15:13:10.101842 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 15:13:10 crc kubenswrapper[4774]: I1003 15:13:10.101852 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 15:13:10 crc kubenswrapper[4774]: I1003 15:13:10.102164 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7bdzq" Oct 03 15:13:10 crc kubenswrapper[4774]: I1003 15:13:10.105669 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 15:13:10 crc kubenswrapper[4774]: I1003 15:13:10.112876 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-97dwl"] Oct 03 15:13:10 crc kubenswrapper[4774]: I1003 15:13:10.186593 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fddgs\" (UniqueName: \"kubernetes.io/projected/365904b3-7404-4fe1-a9bf-c2711b345c08-kube-api-access-fddgs\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-97dwl\" (UID: \"365904b3-7404-4fe1-a9bf-c2711b345c08\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-97dwl" Oct 03 15:13:10 crc kubenswrapper[4774]: I1003 15:13:10.186679 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/365904b3-7404-4fe1-a9bf-c2711b345c08-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-97dwl\" (UID: \"365904b3-7404-4fe1-a9bf-c2711b345c08\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-97dwl" Oct 03 15:13:10 crc kubenswrapper[4774]: I1003 15:13:10.186815 4774 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/365904b3-7404-4fe1-a9bf-c2711b345c08-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-97dwl\" (UID: \"365904b3-7404-4fe1-a9bf-c2711b345c08\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-97dwl" Oct 03 15:13:10 crc kubenswrapper[4774]: I1003 15:13:10.289417 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fddgs\" (UniqueName: \"kubernetes.io/projected/365904b3-7404-4fe1-a9bf-c2711b345c08-kube-api-access-fddgs\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-97dwl\" (UID: \"365904b3-7404-4fe1-a9bf-c2711b345c08\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-97dwl" Oct 03 15:13:10 crc kubenswrapper[4774]: I1003 15:13:10.289518 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/365904b3-7404-4fe1-a9bf-c2711b345c08-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-97dwl\" (UID: \"365904b3-7404-4fe1-a9bf-c2711b345c08\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-97dwl" Oct 03 15:13:10 crc kubenswrapper[4774]: I1003 15:13:10.289621 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/365904b3-7404-4fe1-a9bf-c2711b345c08-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-97dwl\" (UID: \"365904b3-7404-4fe1-a9bf-c2711b345c08\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-97dwl" Oct 03 15:13:10 crc kubenswrapper[4774]: I1003 15:13:10.294571 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/365904b3-7404-4fe1-a9bf-c2711b345c08-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-97dwl\" (UID: 
\"365904b3-7404-4fe1-a9bf-c2711b345c08\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-97dwl" Oct 03 15:13:10 crc kubenswrapper[4774]: I1003 15:13:10.295956 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/365904b3-7404-4fe1-a9bf-c2711b345c08-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-97dwl\" (UID: \"365904b3-7404-4fe1-a9bf-c2711b345c08\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-97dwl" Oct 03 15:13:10 crc kubenswrapper[4774]: I1003 15:13:10.311182 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fddgs\" (UniqueName: \"kubernetes.io/projected/365904b3-7404-4fe1-a9bf-c2711b345c08-kube-api-access-fddgs\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-97dwl\" (UID: \"365904b3-7404-4fe1-a9bf-c2711b345c08\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-97dwl" Oct 03 15:13:10 crc kubenswrapper[4774]: I1003 15:13:10.417981 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-97dwl" Oct 03 15:13:10 crc kubenswrapper[4774]: I1003 15:13:10.968290 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-97dwl"] Oct 03 15:13:11 crc kubenswrapper[4774]: I1003 15:13:11.981913 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-97dwl" event={"ID":"365904b3-7404-4fe1-a9bf-c2711b345c08","Type":"ContainerStarted","Data":"4a3b4edbbeeb9b669794d5959d7297aca2325aad5a7e80d2b6c102bdd528f197"} Oct 03 15:13:11 crc kubenswrapper[4774]: I1003 15:13:11.982487 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-97dwl" event={"ID":"365904b3-7404-4fe1-a9bf-c2711b345c08","Type":"ContainerStarted","Data":"c1402fbba567d026e94e8ae204affa011181ebb0ada6512e4babff49a3a8fc18"} Oct 03 15:13:12 crc kubenswrapper[4774]: I1003 15:13:12.002963 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-97dwl" podStartSLOduration=1.482347056 podStartE2EDuration="2.002937936s" podCreationTimestamp="2025-10-03 15:13:10 +0000 UTC" firstStartedPulling="2025-10-03 15:13:10.972259837 +0000 UTC m=+1813.561463289" lastFinishedPulling="2025-10-03 15:13:11.492850687 +0000 UTC m=+1814.082054169" observedRunningTime="2025-10-03 15:13:11.995909471 +0000 UTC m=+1814.585112923" watchObservedRunningTime="2025-10-03 15:13:12.002937936 +0000 UTC m=+1814.592141388" Oct 03 15:13:13 crc kubenswrapper[4774]: I1003 15:13:13.300931 4774 scope.go:117] "RemoveContainer" containerID="74d4d4fd29e66b3f46bed47cb8482d6e253bf183444fa990f8669734114e0846" Oct 03 15:13:13 crc kubenswrapper[4774]: E1003 15:13:13.301609 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:13:24 crc kubenswrapper[4774]: I1003 15:13:24.299366 4774 scope.go:117] "RemoveContainer" containerID="74d4d4fd29e66b3f46bed47cb8482d6e253bf183444fa990f8669734114e0846" Oct 03 15:13:24 crc kubenswrapper[4774]: E1003 15:13:24.300288 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:13:25 crc kubenswrapper[4774]: I1003 15:13:25.996899 4774 scope.go:117] "RemoveContainer" containerID="125da0755aa46a51b1d2761dd1db9566f71c631bfdd9f55ac18319b30da33858" Oct 03 15:13:26 crc kubenswrapper[4774]: I1003 15:13:26.065950 4774 scope.go:117] "RemoveContainer" containerID="66c704702b10730f4806d6cc857a5fb52bea144f753d5d46ba8e711624b7b866" Oct 03 15:13:26 crc kubenswrapper[4774]: I1003 15:13:26.105966 4774 scope.go:117] "RemoveContainer" containerID="ab49b1b43e1afd447c085cc1fb1a827df7e4610d24b0645fb5af172b3d241d8b" Oct 03 15:13:38 crc kubenswrapper[4774]: I1003 15:13:38.301324 4774 scope.go:117] "RemoveContainer" containerID="74d4d4fd29e66b3f46bed47cb8482d6e253bf183444fa990f8669734114e0846" Oct 03 15:13:38 crc kubenswrapper[4774]: E1003 15:13:38.302845 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:13:48 crc kubenswrapper[4774]: I1003 15:13:48.063482 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-8qhs9"] Oct 03 15:13:48 crc kubenswrapper[4774]: I1003 15:13:48.074109 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-8qhs9"] Oct 03 15:13:49 crc kubenswrapper[4774]: I1003 15:13:49.327148 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5abc11a-7d66-4906-9629-0ce6a1dd5264" path="/var/lib/kubelet/pods/c5abc11a-7d66-4906-9629-0ce6a1dd5264/volumes" Oct 03 15:13:50 crc kubenswrapper[4774]: I1003 15:13:50.300736 4774 scope.go:117] "RemoveContainer" containerID="74d4d4fd29e66b3f46bed47cb8482d6e253bf183444fa990f8669734114e0846" Oct 03 15:13:50 crc kubenswrapper[4774]: E1003 15:13:50.301261 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:14:03 crc kubenswrapper[4774]: I1003 15:14:03.301479 4774 scope.go:117] "RemoveContainer" containerID="74d4d4fd29e66b3f46bed47cb8482d6e253bf183444fa990f8669734114e0846" Oct 03 15:14:03 crc kubenswrapper[4774]: E1003 15:14:03.305807 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:14:07 crc kubenswrapper[4774]: I1003 15:14:07.568981 4774 generic.go:334] "Generic (PLEG): container finished" podID="365904b3-7404-4fe1-a9bf-c2711b345c08" containerID="4a3b4edbbeeb9b669794d5959d7297aca2325aad5a7e80d2b6c102bdd528f197" exitCode=2 Oct 03 15:14:07 crc kubenswrapper[4774]: I1003 15:14:07.569110 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-97dwl" event={"ID":"365904b3-7404-4fe1-a9bf-c2711b345c08","Type":"ContainerDied","Data":"4a3b4edbbeeb9b669794d5959d7297aca2325aad5a7e80d2b6c102bdd528f197"} Oct 03 15:14:09 crc kubenswrapper[4774]: I1003 15:14:09.034519 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-97dwl" Oct 03 15:14:09 crc kubenswrapper[4774]: I1003 15:14:09.166934 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/365904b3-7404-4fe1-a9bf-c2711b345c08-ssh-key\") pod \"365904b3-7404-4fe1-a9bf-c2711b345c08\" (UID: \"365904b3-7404-4fe1-a9bf-c2711b345c08\") " Oct 03 15:14:09 crc kubenswrapper[4774]: I1003 15:14:09.167014 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/365904b3-7404-4fe1-a9bf-c2711b345c08-inventory\") pod \"365904b3-7404-4fe1-a9bf-c2711b345c08\" (UID: \"365904b3-7404-4fe1-a9bf-c2711b345c08\") " Oct 03 15:14:09 crc kubenswrapper[4774]: I1003 15:14:09.167140 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fddgs\" (UniqueName: \"kubernetes.io/projected/365904b3-7404-4fe1-a9bf-c2711b345c08-kube-api-access-fddgs\") pod 
\"365904b3-7404-4fe1-a9bf-c2711b345c08\" (UID: \"365904b3-7404-4fe1-a9bf-c2711b345c08\") " Oct 03 15:14:09 crc kubenswrapper[4774]: I1003 15:14:09.179069 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/365904b3-7404-4fe1-a9bf-c2711b345c08-kube-api-access-fddgs" (OuterVolumeSpecName: "kube-api-access-fddgs") pod "365904b3-7404-4fe1-a9bf-c2711b345c08" (UID: "365904b3-7404-4fe1-a9bf-c2711b345c08"). InnerVolumeSpecName "kube-api-access-fddgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:14:09 crc kubenswrapper[4774]: I1003 15:14:09.200005 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/365904b3-7404-4fe1-a9bf-c2711b345c08-inventory" (OuterVolumeSpecName: "inventory") pod "365904b3-7404-4fe1-a9bf-c2711b345c08" (UID: "365904b3-7404-4fe1-a9bf-c2711b345c08"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:14:09 crc kubenswrapper[4774]: I1003 15:14:09.211663 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/365904b3-7404-4fe1-a9bf-c2711b345c08-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "365904b3-7404-4fe1-a9bf-c2711b345c08" (UID: "365904b3-7404-4fe1-a9bf-c2711b345c08"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:14:09 crc kubenswrapper[4774]: I1003 15:14:09.269585 4774 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/365904b3-7404-4fe1-a9bf-c2711b345c08-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 15:14:09 crc kubenswrapper[4774]: I1003 15:14:09.269643 4774 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/365904b3-7404-4fe1-a9bf-c2711b345c08-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 15:14:09 crc kubenswrapper[4774]: I1003 15:14:09.269666 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fddgs\" (UniqueName: \"kubernetes.io/projected/365904b3-7404-4fe1-a9bf-c2711b345c08-kube-api-access-fddgs\") on node \"crc\" DevicePath \"\"" Oct 03 15:14:09 crc kubenswrapper[4774]: I1003 15:14:09.590784 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-97dwl" event={"ID":"365904b3-7404-4fe1-a9bf-c2711b345c08","Type":"ContainerDied","Data":"c1402fbba567d026e94e8ae204affa011181ebb0ada6512e4babff49a3a8fc18"} Oct 03 15:14:09 crc kubenswrapper[4774]: I1003 15:14:09.591104 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1402fbba567d026e94e8ae204affa011181ebb0ada6512e4babff49a3a8fc18" Oct 03 15:14:09 crc kubenswrapper[4774]: I1003 15:14:09.590880 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-97dwl" Oct 03 15:14:16 crc kubenswrapper[4774]: I1003 15:14:16.033738 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6shkp"] Oct 03 15:14:16 crc kubenswrapper[4774]: E1003 15:14:16.034866 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="365904b3-7404-4fe1-a9bf-c2711b345c08" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 03 15:14:16 crc kubenswrapper[4774]: I1003 15:14:16.034885 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="365904b3-7404-4fe1-a9bf-c2711b345c08" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 03 15:14:16 crc kubenswrapper[4774]: I1003 15:14:16.035166 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="365904b3-7404-4fe1-a9bf-c2711b345c08" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 03 15:14:16 crc kubenswrapper[4774]: I1003 15:14:16.035958 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6shkp" Oct 03 15:14:16 crc kubenswrapper[4774]: I1003 15:14:16.038234 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 15:14:16 crc kubenswrapper[4774]: I1003 15:14:16.038344 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 15:14:16 crc kubenswrapper[4774]: I1003 15:14:16.038515 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7bdzq" Oct 03 15:14:16 crc kubenswrapper[4774]: I1003 15:14:16.040203 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 15:14:16 crc kubenswrapper[4774]: I1003 15:14:16.045302 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6shkp"] Oct 03 15:14:16 crc kubenswrapper[4774]: I1003 15:14:16.096600 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9522d412-aaac-4917-86a0-2d9c40830b8d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6shkp\" (UID: \"9522d412-aaac-4917-86a0-2d9c40830b8d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6shkp" Oct 03 15:14:16 crc kubenswrapper[4774]: I1003 15:14:16.096756 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft4fr\" (UniqueName: \"kubernetes.io/projected/9522d412-aaac-4917-86a0-2d9c40830b8d-kube-api-access-ft4fr\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6shkp\" (UID: \"9522d412-aaac-4917-86a0-2d9c40830b8d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6shkp" Oct 03 15:14:16 crc kubenswrapper[4774]: I1003 15:14:16.097106 4774 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9522d412-aaac-4917-86a0-2d9c40830b8d-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6shkp\" (UID: \"9522d412-aaac-4917-86a0-2d9c40830b8d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6shkp" Oct 03 15:14:16 crc kubenswrapper[4774]: I1003 15:14:16.199262 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9522d412-aaac-4917-86a0-2d9c40830b8d-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6shkp\" (UID: \"9522d412-aaac-4917-86a0-2d9c40830b8d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6shkp" Oct 03 15:14:16 crc kubenswrapper[4774]: I1003 15:14:16.199404 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9522d412-aaac-4917-86a0-2d9c40830b8d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6shkp\" (UID: \"9522d412-aaac-4917-86a0-2d9c40830b8d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6shkp" Oct 03 15:14:16 crc kubenswrapper[4774]: I1003 15:14:16.199459 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft4fr\" (UniqueName: \"kubernetes.io/projected/9522d412-aaac-4917-86a0-2d9c40830b8d-kube-api-access-ft4fr\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6shkp\" (UID: \"9522d412-aaac-4917-86a0-2d9c40830b8d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6shkp" Oct 03 15:14:16 crc kubenswrapper[4774]: I1003 15:14:16.223235 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9522d412-aaac-4917-86a0-2d9c40830b8d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6shkp\" (UID: 
\"9522d412-aaac-4917-86a0-2d9c40830b8d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6shkp" Oct 03 15:14:16 crc kubenswrapper[4774]: I1003 15:14:16.227930 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9522d412-aaac-4917-86a0-2d9c40830b8d-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6shkp\" (UID: \"9522d412-aaac-4917-86a0-2d9c40830b8d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6shkp" Oct 03 15:14:16 crc kubenswrapper[4774]: I1003 15:14:16.228909 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft4fr\" (UniqueName: \"kubernetes.io/projected/9522d412-aaac-4917-86a0-2d9c40830b8d-kube-api-access-ft4fr\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6shkp\" (UID: \"9522d412-aaac-4917-86a0-2d9c40830b8d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6shkp" Oct 03 15:14:16 crc kubenswrapper[4774]: I1003 15:14:16.363798 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6shkp" Oct 03 15:14:16 crc kubenswrapper[4774]: I1003 15:14:16.897205 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6shkp"] Oct 03 15:14:17 crc kubenswrapper[4774]: I1003 15:14:17.301153 4774 scope.go:117] "RemoveContainer" containerID="74d4d4fd29e66b3f46bed47cb8482d6e253bf183444fa990f8669734114e0846" Oct 03 15:14:17 crc kubenswrapper[4774]: E1003 15:14:17.301710 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:14:17 crc kubenswrapper[4774]: I1003 15:14:17.673502 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6shkp" event={"ID":"9522d412-aaac-4917-86a0-2d9c40830b8d","Type":"ContainerStarted","Data":"7e176265180955054ccc45560fb85d3045f6799bc7973bce985c1317d5fb4bf1"} Oct 03 15:14:18 crc kubenswrapper[4774]: I1003 15:14:18.688013 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6shkp" event={"ID":"9522d412-aaac-4917-86a0-2d9c40830b8d","Type":"ContainerStarted","Data":"a649c2026383becfd310f8bb743eb37603c5093ac1a3c2ae423a91f08accaf7c"} Oct 03 15:14:18 crc kubenswrapper[4774]: I1003 15:14:18.714843 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6shkp" podStartSLOduration=2.145400014 podStartE2EDuration="2.714815186s" podCreationTimestamp="2025-10-03 15:14:16 +0000 UTC" firstStartedPulling="2025-10-03 
15:14:16.902842352 +0000 UTC m=+1879.492045824" lastFinishedPulling="2025-10-03 15:14:17.472257544 +0000 UTC m=+1880.061460996" observedRunningTime="2025-10-03 15:14:18.709819152 +0000 UTC m=+1881.299022644" watchObservedRunningTime="2025-10-03 15:14:18.714815186 +0000 UTC m=+1881.304018678" Oct 03 15:14:26 crc kubenswrapper[4774]: I1003 15:14:26.200932 4774 scope.go:117] "RemoveContainer" containerID="da3d46aff0ef3b843d744709380108d10fec0a26379fd469ffab543b18ba3ba9" Oct 03 15:14:32 crc kubenswrapper[4774]: I1003 15:14:32.299626 4774 scope.go:117] "RemoveContainer" containerID="74d4d4fd29e66b3f46bed47cb8482d6e253bf183444fa990f8669734114e0846" Oct 03 15:14:32 crc kubenswrapper[4774]: I1003 15:14:32.822712 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" event={"ID":"ca37ac4b-f421-4198-a179-12901d36f0f5","Type":"ContainerStarted","Data":"7f89eaf64e7c7bc9ea3287549d0d73eb381c82bb1cc017882eb6cc10db2b2164"} Oct 03 15:15:00 crc kubenswrapper[4774]: I1003 15:15:00.151280 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325075-8d55c"] Oct 03 15:15:00 crc kubenswrapper[4774]: I1003 15:15:00.153032 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-8d55c" Oct 03 15:15:00 crc kubenswrapper[4774]: I1003 15:15:00.156226 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 15:15:00 crc kubenswrapper[4774]: I1003 15:15:00.156807 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 15:15:00 crc kubenswrapper[4774]: I1003 15:15:00.171597 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325075-8d55c"] Oct 03 15:15:00 crc kubenswrapper[4774]: I1003 15:15:00.275391 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sstjt\" (UniqueName: \"kubernetes.io/projected/6434840a-0acd-4aae-b1e8-75c9ec59441e-kube-api-access-sstjt\") pod \"collect-profiles-29325075-8d55c\" (UID: \"6434840a-0acd-4aae-b1e8-75c9ec59441e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-8d55c" Oct 03 15:15:00 crc kubenswrapper[4774]: I1003 15:15:00.275561 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6434840a-0acd-4aae-b1e8-75c9ec59441e-secret-volume\") pod \"collect-profiles-29325075-8d55c\" (UID: \"6434840a-0acd-4aae-b1e8-75c9ec59441e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-8d55c" Oct 03 15:15:00 crc kubenswrapper[4774]: I1003 15:15:00.275579 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6434840a-0acd-4aae-b1e8-75c9ec59441e-config-volume\") pod \"collect-profiles-29325075-8d55c\" (UID: \"6434840a-0acd-4aae-b1e8-75c9ec59441e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-8d55c" Oct 03 15:15:00 crc kubenswrapper[4774]: I1003 15:15:00.378218 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6434840a-0acd-4aae-b1e8-75c9ec59441e-secret-volume\") pod \"collect-profiles-29325075-8d55c\" (UID: \"6434840a-0acd-4aae-b1e8-75c9ec59441e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-8d55c" Oct 03 15:15:00 crc kubenswrapper[4774]: I1003 15:15:00.378646 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6434840a-0acd-4aae-b1e8-75c9ec59441e-config-volume\") pod \"collect-profiles-29325075-8d55c\" (UID: \"6434840a-0acd-4aae-b1e8-75c9ec59441e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-8d55c" Oct 03 15:15:00 crc kubenswrapper[4774]: I1003 15:15:00.378881 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sstjt\" (UniqueName: \"kubernetes.io/projected/6434840a-0acd-4aae-b1e8-75c9ec59441e-kube-api-access-sstjt\") pod \"collect-profiles-29325075-8d55c\" (UID: \"6434840a-0acd-4aae-b1e8-75c9ec59441e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-8d55c" Oct 03 15:15:00 crc kubenswrapper[4774]: I1003 15:15:00.380749 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6434840a-0acd-4aae-b1e8-75c9ec59441e-config-volume\") pod \"collect-profiles-29325075-8d55c\" (UID: \"6434840a-0acd-4aae-b1e8-75c9ec59441e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-8d55c" Oct 03 15:15:00 crc kubenswrapper[4774]: I1003 15:15:00.387247 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/6434840a-0acd-4aae-b1e8-75c9ec59441e-secret-volume\") pod \"collect-profiles-29325075-8d55c\" (UID: \"6434840a-0acd-4aae-b1e8-75c9ec59441e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-8d55c" Oct 03 15:15:00 crc kubenswrapper[4774]: I1003 15:15:00.398338 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sstjt\" (UniqueName: \"kubernetes.io/projected/6434840a-0acd-4aae-b1e8-75c9ec59441e-kube-api-access-sstjt\") pod \"collect-profiles-29325075-8d55c\" (UID: \"6434840a-0acd-4aae-b1e8-75c9ec59441e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-8d55c" Oct 03 15:15:00 crc kubenswrapper[4774]: I1003 15:15:00.471711 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-8d55c" Oct 03 15:15:01 crc kubenswrapper[4774]: I1003 15:15:01.044831 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325075-8d55c"] Oct 03 15:15:01 crc kubenswrapper[4774]: I1003 15:15:01.117694 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-8d55c" event={"ID":"6434840a-0acd-4aae-b1e8-75c9ec59441e","Type":"ContainerStarted","Data":"b7fc1268b76caf0fae9a11e5f290b601cf0db5b65cecdf0bb630b028d5134871"} Oct 03 15:15:02 crc kubenswrapper[4774]: I1003 15:15:02.130446 4774 generic.go:334] "Generic (PLEG): container finished" podID="6434840a-0acd-4aae-b1e8-75c9ec59441e" containerID="e3ee2b09e61fea829d0116d6df6db37b722bff7f3ffb86b10b6fb098d5b90778" exitCode=0 Oct 03 15:15:02 crc kubenswrapper[4774]: I1003 15:15:02.130580 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-8d55c" 
event={"ID":"6434840a-0acd-4aae-b1e8-75c9ec59441e","Type":"ContainerDied","Data":"e3ee2b09e61fea829d0116d6df6db37b722bff7f3ffb86b10b6fb098d5b90778"} Oct 03 15:15:03 crc kubenswrapper[4774]: I1003 15:15:03.563899 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-8d55c" Oct 03 15:15:03 crc kubenswrapper[4774]: I1003 15:15:03.652634 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6434840a-0acd-4aae-b1e8-75c9ec59441e-secret-volume\") pod \"6434840a-0acd-4aae-b1e8-75c9ec59441e\" (UID: \"6434840a-0acd-4aae-b1e8-75c9ec59441e\") " Oct 03 15:15:03 crc kubenswrapper[4774]: I1003 15:15:03.653135 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6434840a-0acd-4aae-b1e8-75c9ec59441e-config-volume\") pod \"6434840a-0acd-4aae-b1e8-75c9ec59441e\" (UID: \"6434840a-0acd-4aae-b1e8-75c9ec59441e\") " Oct 03 15:15:03 crc kubenswrapper[4774]: I1003 15:15:03.653204 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sstjt\" (UniqueName: \"kubernetes.io/projected/6434840a-0acd-4aae-b1e8-75c9ec59441e-kube-api-access-sstjt\") pod \"6434840a-0acd-4aae-b1e8-75c9ec59441e\" (UID: \"6434840a-0acd-4aae-b1e8-75c9ec59441e\") " Oct 03 15:15:03 crc kubenswrapper[4774]: I1003 15:15:03.653906 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6434840a-0acd-4aae-b1e8-75c9ec59441e-config-volume" (OuterVolumeSpecName: "config-volume") pod "6434840a-0acd-4aae-b1e8-75c9ec59441e" (UID: "6434840a-0acd-4aae-b1e8-75c9ec59441e"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:15:03 crc kubenswrapper[4774]: I1003 15:15:03.659438 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6434840a-0acd-4aae-b1e8-75c9ec59441e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6434840a-0acd-4aae-b1e8-75c9ec59441e" (UID: "6434840a-0acd-4aae-b1e8-75c9ec59441e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:15:03 crc kubenswrapper[4774]: I1003 15:15:03.659602 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6434840a-0acd-4aae-b1e8-75c9ec59441e-kube-api-access-sstjt" (OuterVolumeSpecName: "kube-api-access-sstjt") pod "6434840a-0acd-4aae-b1e8-75c9ec59441e" (UID: "6434840a-0acd-4aae-b1e8-75c9ec59441e"). InnerVolumeSpecName "kube-api-access-sstjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:15:03 crc kubenswrapper[4774]: I1003 15:15:03.756307 4774 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6434840a-0acd-4aae-b1e8-75c9ec59441e-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 15:15:03 crc kubenswrapper[4774]: I1003 15:15:03.756427 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sstjt\" (UniqueName: \"kubernetes.io/projected/6434840a-0acd-4aae-b1e8-75c9ec59441e-kube-api-access-sstjt\") on node \"crc\" DevicePath \"\"" Oct 03 15:15:03 crc kubenswrapper[4774]: I1003 15:15:03.756461 4774 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6434840a-0acd-4aae-b1e8-75c9ec59441e-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 15:15:04 crc kubenswrapper[4774]: I1003 15:15:04.152958 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-8d55c" 
event={"ID":"6434840a-0acd-4aae-b1e8-75c9ec59441e","Type":"ContainerDied","Data":"b7fc1268b76caf0fae9a11e5f290b601cf0db5b65cecdf0bb630b028d5134871"} Oct 03 15:15:04 crc kubenswrapper[4774]: I1003 15:15:04.153009 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7fc1268b76caf0fae9a11e5f290b601cf0db5b65cecdf0bb630b028d5134871" Oct 03 15:15:04 crc kubenswrapper[4774]: I1003 15:15:04.153073 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-8d55c" Oct 03 15:15:04 crc kubenswrapper[4774]: I1003 15:15:04.157807 4774 generic.go:334] "Generic (PLEG): container finished" podID="9522d412-aaac-4917-86a0-2d9c40830b8d" containerID="a649c2026383becfd310f8bb743eb37603c5093ac1a3c2ae423a91f08accaf7c" exitCode=0 Oct 03 15:15:04 crc kubenswrapper[4774]: I1003 15:15:04.157918 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6shkp" event={"ID":"9522d412-aaac-4917-86a0-2d9c40830b8d","Type":"ContainerDied","Data":"a649c2026383becfd310f8bb743eb37603c5093ac1a3c2ae423a91f08accaf7c"} Oct 03 15:15:05 crc kubenswrapper[4774]: I1003 15:15:05.673957 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6shkp" Oct 03 15:15:05 crc kubenswrapper[4774]: I1003 15:15:05.797048 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9522d412-aaac-4917-86a0-2d9c40830b8d-ssh-key\") pod \"9522d412-aaac-4917-86a0-2d9c40830b8d\" (UID: \"9522d412-aaac-4917-86a0-2d9c40830b8d\") " Oct 03 15:15:05 crc kubenswrapper[4774]: I1003 15:15:05.797147 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9522d412-aaac-4917-86a0-2d9c40830b8d-inventory\") pod \"9522d412-aaac-4917-86a0-2d9c40830b8d\" (UID: \"9522d412-aaac-4917-86a0-2d9c40830b8d\") " Oct 03 15:15:05 crc kubenswrapper[4774]: I1003 15:15:05.797220 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ft4fr\" (UniqueName: \"kubernetes.io/projected/9522d412-aaac-4917-86a0-2d9c40830b8d-kube-api-access-ft4fr\") pod \"9522d412-aaac-4917-86a0-2d9c40830b8d\" (UID: \"9522d412-aaac-4917-86a0-2d9c40830b8d\") " Oct 03 15:15:05 crc kubenswrapper[4774]: I1003 15:15:05.804641 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9522d412-aaac-4917-86a0-2d9c40830b8d-kube-api-access-ft4fr" (OuterVolumeSpecName: "kube-api-access-ft4fr") pod "9522d412-aaac-4917-86a0-2d9c40830b8d" (UID: "9522d412-aaac-4917-86a0-2d9c40830b8d"). InnerVolumeSpecName "kube-api-access-ft4fr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:15:05 crc kubenswrapper[4774]: I1003 15:15:05.829560 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9522d412-aaac-4917-86a0-2d9c40830b8d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9522d412-aaac-4917-86a0-2d9c40830b8d" (UID: "9522d412-aaac-4917-86a0-2d9c40830b8d"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:15:05 crc kubenswrapper[4774]: I1003 15:15:05.835993 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9522d412-aaac-4917-86a0-2d9c40830b8d-inventory" (OuterVolumeSpecName: "inventory") pod "9522d412-aaac-4917-86a0-2d9c40830b8d" (UID: "9522d412-aaac-4917-86a0-2d9c40830b8d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:15:05 crc kubenswrapper[4774]: I1003 15:15:05.900312 4774 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9522d412-aaac-4917-86a0-2d9c40830b8d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 15:15:05 crc kubenswrapper[4774]: I1003 15:15:05.900375 4774 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9522d412-aaac-4917-86a0-2d9c40830b8d-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 15:15:05 crc kubenswrapper[4774]: I1003 15:15:05.900417 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ft4fr\" (UniqueName: \"kubernetes.io/projected/9522d412-aaac-4917-86a0-2d9c40830b8d-kube-api-access-ft4fr\") on node \"crc\" DevicePath \"\"" Oct 03 15:15:06 crc kubenswrapper[4774]: I1003 15:15:06.180921 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6shkp" event={"ID":"9522d412-aaac-4917-86a0-2d9c40830b8d","Type":"ContainerDied","Data":"7e176265180955054ccc45560fb85d3045f6799bc7973bce985c1317d5fb4bf1"} Oct 03 15:15:06 crc kubenswrapper[4774]: I1003 15:15:06.181295 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e176265180955054ccc45560fb85d3045f6799bc7973bce985c1317d5fb4bf1" Oct 03 15:15:06 crc kubenswrapper[4774]: I1003 15:15:06.181019 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6shkp" Oct 03 15:15:06 crc kubenswrapper[4774]: I1003 15:15:06.281522 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-rnpx9"] Oct 03 15:15:06 crc kubenswrapper[4774]: E1003 15:15:06.281915 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9522d412-aaac-4917-86a0-2d9c40830b8d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 03 15:15:06 crc kubenswrapper[4774]: I1003 15:15:06.281937 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="9522d412-aaac-4917-86a0-2d9c40830b8d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 03 15:15:06 crc kubenswrapper[4774]: E1003 15:15:06.281965 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6434840a-0acd-4aae-b1e8-75c9ec59441e" containerName="collect-profiles" Oct 03 15:15:06 crc kubenswrapper[4774]: I1003 15:15:06.281974 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="6434840a-0acd-4aae-b1e8-75c9ec59441e" containerName="collect-profiles" Oct 03 15:15:06 crc kubenswrapper[4774]: I1003 15:15:06.282209 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="9522d412-aaac-4917-86a0-2d9c40830b8d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 03 15:15:06 crc kubenswrapper[4774]: I1003 15:15:06.282231 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="6434840a-0acd-4aae-b1e8-75c9ec59441e" containerName="collect-profiles" Oct 03 15:15:06 crc kubenswrapper[4774]: I1003 15:15:06.282978 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-rnpx9" Oct 03 15:15:06 crc kubenswrapper[4774]: I1003 15:15:06.287302 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 15:15:06 crc kubenswrapper[4774]: I1003 15:15:06.287568 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 15:15:06 crc kubenswrapper[4774]: I1003 15:15:06.288288 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7bdzq" Oct 03 15:15:06 crc kubenswrapper[4774]: I1003 15:15:06.288503 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 15:15:06 crc kubenswrapper[4774]: I1003 15:15:06.296663 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-rnpx9"] Oct 03 15:15:06 crc kubenswrapper[4774]: I1003 15:15:06.414484 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04fa8c76-a20b-4ae9-86f0-fe4801763d0e-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-rnpx9\" (UID: \"04fa8c76-a20b-4ae9-86f0-fe4801763d0e\") " pod="openstack/ssh-known-hosts-edpm-deployment-rnpx9" Oct 03 15:15:06 crc kubenswrapper[4774]: I1003 15:15:06.414558 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/04fa8c76-a20b-4ae9-86f0-fe4801763d0e-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-rnpx9\" (UID: \"04fa8c76-a20b-4ae9-86f0-fe4801763d0e\") " pod="openstack/ssh-known-hosts-edpm-deployment-rnpx9" Oct 03 15:15:06 crc kubenswrapper[4774]: I1003 15:15:06.414930 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-25f98\" (UniqueName: \"kubernetes.io/projected/04fa8c76-a20b-4ae9-86f0-fe4801763d0e-kube-api-access-25f98\") pod \"ssh-known-hosts-edpm-deployment-rnpx9\" (UID: \"04fa8c76-a20b-4ae9-86f0-fe4801763d0e\") " pod="openstack/ssh-known-hosts-edpm-deployment-rnpx9" Oct 03 15:15:06 crc kubenswrapper[4774]: I1003 15:15:06.516890 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04fa8c76-a20b-4ae9-86f0-fe4801763d0e-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-rnpx9\" (UID: \"04fa8c76-a20b-4ae9-86f0-fe4801763d0e\") " pod="openstack/ssh-known-hosts-edpm-deployment-rnpx9" Oct 03 15:15:06 crc kubenswrapper[4774]: I1003 15:15:06.516936 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/04fa8c76-a20b-4ae9-86f0-fe4801763d0e-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-rnpx9\" (UID: \"04fa8c76-a20b-4ae9-86f0-fe4801763d0e\") " pod="openstack/ssh-known-hosts-edpm-deployment-rnpx9" Oct 03 15:15:06 crc kubenswrapper[4774]: I1003 15:15:06.517023 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25f98\" (UniqueName: \"kubernetes.io/projected/04fa8c76-a20b-4ae9-86f0-fe4801763d0e-kube-api-access-25f98\") pod \"ssh-known-hosts-edpm-deployment-rnpx9\" (UID: \"04fa8c76-a20b-4ae9-86f0-fe4801763d0e\") " pod="openstack/ssh-known-hosts-edpm-deployment-rnpx9" Oct 03 15:15:06 crc kubenswrapper[4774]: I1003 15:15:06.526340 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04fa8c76-a20b-4ae9-86f0-fe4801763d0e-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-rnpx9\" (UID: \"04fa8c76-a20b-4ae9-86f0-fe4801763d0e\") " pod="openstack/ssh-known-hosts-edpm-deployment-rnpx9" Oct 03 15:15:06 crc kubenswrapper[4774]: I1003 
15:15:06.527205 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/04fa8c76-a20b-4ae9-86f0-fe4801763d0e-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-rnpx9\" (UID: \"04fa8c76-a20b-4ae9-86f0-fe4801763d0e\") " pod="openstack/ssh-known-hosts-edpm-deployment-rnpx9" Oct 03 15:15:06 crc kubenswrapper[4774]: I1003 15:15:06.540225 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25f98\" (UniqueName: \"kubernetes.io/projected/04fa8c76-a20b-4ae9-86f0-fe4801763d0e-kube-api-access-25f98\") pod \"ssh-known-hosts-edpm-deployment-rnpx9\" (UID: \"04fa8c76-a20b-4ae9-86f0-fe4801763d0e\") " pod="openstack/ssh-known-hosts-edpm-deployment-rnpx9" Oct 03 15:15:06 crc kubenswrapper[4774]: I1003 15:15:06.605933 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-rnpx9" Oct 03 15:15:07 crc kubenswrapper[4774]: I1003 15:15:07.176425 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-rnpx9"] Oct 03 15:15:07 crc kubenswrapper[4774]: W1003 15:15:07.191097 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04fa8c76_a20b_4ae9_86f0_fe4801763d0e.slice/crio-0532a3ad723253bf3a2c78f15792b2660377a15f3227f583808da10a63230457 WatchSource:0}: Error finding container 0532a3ad723253bf3a2c78f15792b2660377a15f3227f583808da10a63230457: Status 404 returned error can't find the container with id 0532a3ad723253bf3a2c78f15792b2660377a15f3227f583808da10a63230457 Oct 03 15:15:08 crc kubenswrapper[4774]: I1003 15:15:08.202011 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-rnpx9" event={"ID":"04fa8c76-a20b-4ae9-86f0-fe4801763d0e","Type":"ContainerStarted","Data":"06fbb837987451dadc9b94252871beb1760a2c91d18724184549b02ca566c0e9"} Oct 03 
15:15:08 crc kubenswrapper[4774]: I1003 15:15:08.203513 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-rnpx9" event={"ID":"04fa8c76-a20b-4ae9-86f0-fe4801763d0e","Type":"ContainerStarted","Data":"0532a3ad723253bf3a2c78f15792b2660377a15f3227f583808da10a63230457"} Oct 03 15:15:08 crc kubenswrapper[4774]: I1003 15:15:08.228250 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-rnpx9" podStartSLOduration=1.587113864 podStartE2EDuration="2.228218279s" podCreationTimestamp="2025-10-03 15:15:06 +0000 UTC" firstStartedPulling="2025-10-03 15:15:07.194650328 +0000 UTC m=+1929.783853820" lastFinishedPulling="2025-10-03 15:15:07.835754743 +0000 UTC m=+1930.424958235" observedRunningTime="2025-10-03 15:15:08.223328827 +0000 UTC m=+1930.812532279" watchObservedRunningTime="2025-10-03 15:15:08.228218279 +0000 UTC m=+1930.817421771" Oct 03 15:15:16 crc kubenswrapper[4774]: I1003 15:15:16.284496 4774 generic.go:334] "Generic (PLEG): container finished" podID="04fa8c76-a20b-4ae9-86f0-fe4801763d0e" containerID="06fbb837987451dadc9b94252871beb1760a2c91d18724184549b02ca566c0e9" exitCode=0 Oct 03 15:15:16 crc kubenswrapper[4774]: I1003 15:15:16.284577 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-rnpx9" event={"ID":"04fa8c76-a20b-4ae9-86f0-fe4801763d0e","Type":"ContainerDied","Data":"06fbb837987451dadc9b94252871beb1760a2c91d18724184549b02ca566c0e9"} Oct 03 15:15:17 crc kubenswrapper[4774]: I1003 15:15:17.783268 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-rnpx9" Oct 03 15:15:17 crc kubenswrapper[4774]: I1003 15:15:17.843459 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04fa8c76-a20b-4ae9-86f0-fe4801763d0e-ssh-key-openstack-edpm-ipam\") pod \"04fa8c76-a20b-4ae9-86f0-fe4801763d0e\" (UID: \"04fa8c76-a20b-4ae9-86f0-fe4801763d0e\") " Oct 03 15:15:17 crc kubenswrapper[4774]: I1003 15:15:17.844069 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25f98\" (UniqueName: \"kubernetes.io/projected/04fa8c76-a20b-4ae9-86f0-fe4801763d0e-kube-api-access-25f98\") pod \"04fa8c76-a20b-4ae9-86f0-fe4801763d0e\" (UID: \"04fa8c76-a20b-4ae9-86f0-fe4801763d0e\") " Oct 03 15:15:17 crc kubenswrapper[4774]: I1003 15:15:17.844178 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/04fa8c76-a20b-4ae9-86f0-fe4801763d0e-inventory-0\") pod \"04fa8c76-a20b-4ae9-86f0-fe4801763d0e\" (UID: \"04fa8c76-a20b-4ae9-86f0-fe4801763d0e\") " Oct 03 15:15:17 crc kubenswrapper[4774]: I1003 15:15:17.848553 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04fa8c76-a20b-4ae9-86f0-fe4801763d0e-kube-api-access-25f98" (OuterVolumeSpecName: "kube-api-access-25f98") pod "04fa8c76-a20b-4ae9-86f0-fe4801763d0e" (UID: "04fa8c76-a20b-4ae9-86f0-fe4801763d0e"). InnerVolumeSpecName "kube-api-access-25f98". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:15:17 crc kubenswrapper[4774]: I1003 15:15:17.870056 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04fa8c76-a20b-4ae9-86f0-fe4801763d0e-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "04fa8c76-a20b-4ae9-86f0-fe4801763d0e" (UID: "04fa8c76-a20b-4ae9-86f0-fe4801763d0e"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:15:17 crc kubenswrapper[4774]: I1003 15:15:17.870466 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04fa8c76-a20b-4ae9-86f0-fe4801763d0e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "04fa8c76-a20b-4ae9-86f0-fe4801763d0e" (UID: "04fa8c76-a20b-4ae9-86f0-fe4801763d0e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:15:17 crc kubenswrapper[4774]: I1003 15:15:17.947002 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25f98\" (UniqueName: \"kubernetes.io/projected/04fa8c76-a20b-4ae9-86f0-fe4801763d0e-kube-api-access-25f98\") on node \"crc\" DevicePath \"\"" Oct 03 15:15:17 crc kubenswrapper[4774]: I1003 15:15:17.947052 4774 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/04fa8c76-a20b-4ae9-86f0-fe4801763d0e-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 03 15:15:17 crc kubenswrapper[4774]: I1003 15:15:17.947066 4774 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04fa8c76-a20b-4ae9-86f0-fe4801763d0e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 03 15:15:18 crc kubenswrapper[4774]: I1003 15:15:18.346146 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-rnpx9" event={"ID":"04fa8c76-a20b-4ae9-86f0-fe4801763d0e","Type":"ContainerDied","Data":"0532a3ad723253bf3a2c78f15792b2660377a15f3227f583808da10a63230457"} Oct 03 15:15:18 crc kubenswrapper[4774]: I1003 15:15:18.346187 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0532a3ad723253bf3a2c78f15792b2660377a15f3227f583808da10a63230457" Oct 03 15:15:18 crc kubenswrapper[4774]: I1003 15:15:18.346262 
4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-rnpx9" Oct 03 15:15:18 crc kubenswrapper[4774]: I1003 15:15:18.431229 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-mrdws"] Oct 03 15:15:18 crc kubenswrapper[4774]: E1003 15:15:18.431686 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04fa8c76-a20b-4ae9-86f0-fe4801763d0e" containerName="ssh-known-hosts-edpm-deployment" Oct 03 15:15:18 crc kubenswrapper[4774]: I1003 15:15:18.431706 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="04fa8c76-a20b-4ae9-86f0-fe4801763d0e" containerName="ssh-known-hosts-edpm-deployment" Oct 03 15:15:18 crc kubenswrapper[4774]: I1003 15:15:18.431956 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="04fa8c76-a20b-4ae9-86f0-fe4801763d0e" containerName="ssh-known-hosts-edpm-deployment" Oct 03 15:15:18 crc kubenswrapper[4774]: I1003 15:15:18.432709 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mrdws" Oct 03 15:15:18 crc kubenswrapper[4774]: I1003 15:15:18.436104 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 15:15:18 crc kubenswrapper[4774]: I1003 15:15:18.436893 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 15:15:18 crc kubenswrapper[4774]: I1003 15:15:18.437229 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7bdzq" Oct 03 15:15:18 crc kubenswrapper[4774]: I1003 15:15:18.437642 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 15:15:18 crc kubenswrapper[4774]: I1003 15:15:18.439415 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-mrdws"] Oct 03 15:15:18 crc kubenswrapper[4774]: I1003 15:15:18.460038 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81364ab7-a73d-4fef-b065-62983751634b-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mrdws\" (UID: \"81364ab7-a73d-4fef-b065-62983751634b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mrdws" Oct 03 15:15:18 crc kubenswrapper[4774]: I1003 15:15:18.460094 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81364ab7-a73d-4fef-b065-62983751634b-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mrdws\" (UID: \"81364ab7-a73d-4fef-b065-62983751634b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mrdws" Oct 03 15:15:18 crc kubenswrapper[4774]: I1003 15:15:18.460123 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j84rt\" (UniqueName: \"kubernetes.io/projected/81364ab7-a73d-4fef-b065-62983751634b-kube-api-access-j84rt\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mrdws\" (UID: \"81364ab7-a73d-4fef-b065-62983751634b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mrdws" Oct 03 15:15:18 crc kubenswrapper[4774]: I1003 15:15:18.563566 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81364ab7-a73d-4fef-b065-62983751634b-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mrdws\" (UID: \"81364ab7-a73d-4fef-b065-62983751634b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mrdws" Oct 03 15:15:18 crc kubenswrapper[4774]: I1003 15:15:18.563624 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81364ab7-a73d-4fef-b065-62983751634b-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mrdws\" (UID: \"81364ab7-a73d-4fef-b065-62983751634b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mrdws" Oct 03 15:15:18 crc kubenswrapper[4774]: I1003 15:15:18.563668 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j84rt\" (UniqueName: \"kubernetes.io/projected/81364ab7-a73d-4fef-b065-62983751634b-kube-api-access-j84rt\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mrdws\" (UID: \"81364ab7-a73d-4fef-b065-62983751634b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mrdws" Oct 03 15:15:18 crc kubenswrapper[4774]: I1003 15:15:18.568183 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81364ab7-a73d-4fef-b065-62983751634b-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mrdws\" (UID: \"81364ab7-a73d-4fef-b065-62983751634b\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mrdws" Oct 03 15:15:18 crc kubenswrapper[4774]: I1003 15:15:18.577163 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81364ab7-a73d-4fef-b065-62983751634b-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mrdws\" (UID: \"81364ab7-a73d-4fef-b065-62983751634b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mrdws" Oct 03 15:15:18 crc kubenswrapper[4774]: I1003 15:15:18.580642 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j84rt\" (UniqueName: \"kubernetes.io/projected/81364ab7-a73d-4fef-b065-62983751634b-kube-api-access-j84rt\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mrdws\" (UID: \"81364ab7-a73d-4fef-b065-62983751634b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mrdws" Oct 03 15:15:18 crc kubenswrapper[4774]: I1003 15:15:18.748866 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mrdws" Oct 03 15:15:19 crc kubenswrapper[4774]: I1003 15:15:19.338000 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-mrdws"] Oct 03 15:15:19 crc kubenswrapper[4774]: I1003 15:15:19.359704 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mrdws" event={"ID":"81364ab7-a73d-4fef-b065-62983751634b","Type":"ContainerStarted","Data":"2d18efcb2e826eacf0f1df4a3c707e241ffd58c52c7e83dc6732b2d60ee2172c"} Oct 03 15:15:20 crc kubenswrapper[4774]: I1003 15:15:20.370239 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mrdws" event={"ID":"81364ab7-a73d-4fef-b065-62983751634b","Type":"ContainerStarted","Data":"2abe1213b00d104055d7be4264c8e15faedc627e0a7d1d97a2aea2f1ba082e90"} Oct 03 15:15:20 crc kubenswrapper[4774]: I1003 15:15:20.395342 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mrdws" podStartSLOduration=1.8284421229999999 podStartE2EDuration="2.395322331s" podCreationTimestamp="2025-10-03 15:15:18 +0000 UTC" firstStartedPulling="2025-10-03 15:15:19.350398336 +0000 UTC m=+1941.939601798" lastFinishedPulling="2025-10-03 15:15:19.917278524 +0000 UTC m=+1942.506482006" observedRunningTime="2025-10-03 15:15:20.387902996 +0000 UTC m=+1942.977106488" watchObservedRunningTime="2025-10-03 15:15:20.395322331 +0000 UTC m=+1942.984525793" Oct 03 15:15:28 crc kubenswrapper[4774]: I1003 15:15:28.447901 4774 generic.go:334] "Generic (PLEG): container finished" podID="81364ab7-a73d-4fef-b065-62983751634b" containerID="2abe1213b00d104055d7be4264c8e15faedc627e0a7d1d97a2aea2f1ba082e90" exitCode=0 Oct 03 15:15:28 crc kubenswrapper[4774]: I1003 15:15:28.447970 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mrdws" event={"ID":"81364ab7-a73d-4fef-b065-62983751634b","Type":"ContainerDied","Data":"2abe1213b00d104055d7be4264c8e15faedc627e0a7d1d97a2aea2f1ba082e90"} Oct 03 15:15:29 crc kubenswrapper[4774]: I1003 15:15:29.846610 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mrdws" Oct 03 15:15:29 crc kubenswrapper[4774]: I1003 15:15:29.886128 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81364ab7-a73d-4fef-b065-62983751634b-ssh-key\") pod \"81364ab7-a73d-4fef-b065-62983751634b\" (UID: \"81364ab7-a73d-4fef-b065-62983751634b\") " Oct 03 15:15:29 crc kubenswrapper[4774]: I1003 15:15:29.886259 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j84rt\" (UniqueName: \"kubernetes.io/projected/81364ab7-a73d-4fef-b065-62983751634b-kube-api-access-j84rt\") pod \"81364ab7-a73d-4fef-b065-62983751634b\" (UID: \"81364ab7-a73d-4fef-b065-62983751634b\") " Oct 03 15:15:29 crc kubenswrapper[4774]: I1003 15:15:29.886300 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81364ab7-a73d-4fef-b065-62983751634b-inventory\") pod \"81364ab7-a73d-4fef-b065-62983751634b\" (UID: \"81364ab7-a73d-4fef-b065-62983751634b\") " Oct 03 15:15:29 crc kubenswrapper[4774]: I1003 15:15:29.891569 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81364ab7-a73d-4fef-b065-62983751634b-kube-api-access-j84rt" (OuterVolumeSpecName: "kube-api-access-j84rt") pod "81364ab7-a73d-4fef-b065-62983751634b" (UID: "81364ab7-a73d-4fef-b065-62983751634b"). InnerVolumeSpecName "kube-api-access-j84rt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:15:29 crc kubenswrapper[4774]: I1003 15:15:29.914527 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81364ab7-a73d-4fef-b065-62983751634b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "81364ab7-a73d-4fef-b065-62983751634b" (UID: "81364ab7-a73d-4fef-b065-62983751634b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:15:29 crc kubenswrapper[4774]: I1003 15:15:29.939834 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81364ab7-a73d-4fef-b065-62983751634b-inventory" (OuterVolumeSpecName: "inventory") pod "81364ab7-a73d-4fef-b065-62983751634b" (UID: "81364ab7-a73d-4fef-b065-62983751634b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:15:29 crc kubenswrapper[4774]: I1003 15:15:29.988449 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j84rt\" (UniqueName: \"kubernetes.io/projected/81364ab7-a73d-4fef-b065-62983751634b-kube-api-access-j84rt\") on node \"crc\" DevicePath \"\"" Oct 03 15:15:29 crc kubenswrapper[4774]: I1003 15:15:29.988517 4774 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81364ab7-a73d-4fef-b065-62983751634b-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 15:15:29 crc kubenswrapper[4774]: I1003 15:15:29.988530 4774 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81364ab7-a73d-4fef-b065-62983751634b-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 15:15:30 crc kubenswrapper[4774]: I1003 15:15:30.473351 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mrdws" 
event={"ID":"81364ab7-a73d-4fef-b065-62983751634b","Type":"ContainerDied","Data":"2d18efcb2e826eacf0f1df4a3c707e241ffd58c52c7e83dc6732b2d60ee2172c"} Oct 03 15:15:30 crc kubenswrapper[4774]: I1003 15:15:30.473457 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d18efcb2e826eacf0f1df4a3c707e241ffd58c52c7e83dc6732b2d60ee2172c" Oct 03 15:15:30 crc kubenswrapper[4774]: I1003 15:15:30.473550 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mrdws" Oct 03 15:15:30 crc kubenswrapper[4774]: I1003 15:15:30.577450 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dshx4"] Oct 03 15:15:30 crc kubenswrapper[4774]: E1003 15:15:30.578250 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81364ab7-a73d-4fef-b065-62983751634b" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 03 15:15:30 crc kubenswrapper[4774]: I1003 15:15:30.578273 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="81364ab7-a73d-4fef-b065-62983751634b" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 03 15:15:30 crc kubenswrapper[4774]: I1003 15:15:30.578571 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="81364ab7-a73d-4fef-b065-62983751634b" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 03 15:15:30 crc kubenswrapper[4774]: I1003 15:15:30.579319 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dshx4" Oct 03 15:15:30 crc kubenswrapper[4774]: I1003 15:15:30.586174 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 15:15:30 crc kubenswrapper[4774]: I1003 15:15:30.586720 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7bdzq" Oct 03 15:15:30 crc kubenswrapper[4774]: I1003 15:15:30.587160 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 15:15:30 crc kubenswrapper[4774]: I1003 15:15:30.591231 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dshx4"] Oct 03 15:15:30 crc kubenswrapper[4774]: I1003 15:15:30.593952 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 15:15:30 crc kubenswrapper[4774]: I1003 15:15:30.610437 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpnl4\" (UniqueName: \"kubernetes.io/projected/776f45f5-5644-428e-a25f-9e3b36960fd9-kube-api-access-mpnl4\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dshx4\" (UID: \"776f45f5-5644-428e-a25f-9e3b36960fd9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dshx4" Oct 03 15:15:30 crc kubenswrapper[4774]: I1003 15:15:30.610514 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/776f45f5-5644-428e-a25f-9e3b36960fd9-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dshx4\" (UID: \"776f45f5-5644-428e-a25f-9e3b36960fd9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dshx4" Oct 03 15:15:30 crc kubenswrapper[4774]: I1003 15:15:30.610664 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/776f45f5-5644-428e-a25f-9e3b36960fd9-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dshx4\" (UID: \"776f45f5-5644-428e-a25f-9e3b36960fd9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dshx4" Oct 03 15:15:30 crc kubenswrapper[4774]: I1003 15:15:30.711310 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/776f45f5-5644-428e-a25f-9e3b36960fd9-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dshx4\" (UID: \"776f45f5-5644-428e-a25f-9e3b36960fd9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dshx4" Oct 03 15:15:30 crc kubenswrapper[4774]: I1003 15:15:30.711386 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpnl4\" (UniqueName: \"kubernetes.io/projected/776f45f5-5644-428e-a25f-9e3b36960fd9-kube-api-access-mpnl4\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dshx4\" (UID: \"776f45f5-5644-428e-a25f-9e3b36960fd9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dshx4" Oct 03 15:15:30 crc kubenswrapper[4774]: I1003 15:15:30.711446 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/776f45f5-5644-428e-a25f-9e3b36960fd9-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dshx4\" (UID: \"776f45f5-5644-428e-a25f-9e3b36960fd9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dshx4" Oct 03 15:15:30 crc kubenswrapper[4774]: I1003 15:15:30.716348 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/776f45f5-5644-428e-a25f-9e3b36960fd9-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dshx4\" (UID: \"776f45f5-5644-428e-a25f-9e3b36960fd9\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dshx4" Oct 03 15:15:30 crc kubenswrapper[4774]: I1003 15:15:30.716581 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/776f45f5-5644-428e-a25f-9e3b36960fd9-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dshx4\" (UID: \"776f45f5-5644-428e-a25f-9e3b36960fd9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dshx4" Oct 03 15:15:30 crc kubenswrapper[4774]: I1003 15:15:30.727267 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpnl4\" (UniqueName: \"kubernetes.io/projected/776f45f5-5644-428e-a25f-9e3b36960fd9-kube-api-access-mpnl4\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dshx4\" (UID: \"776f45f5-5644-428e-a25f-9e3b36960fd9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dshx4" Oct 03 15:15:30 crc kubenswrapper[4774]: I1003 15:15:30.906604 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dshx4" Oct 03 15:15:31 crc kubenswrapper[4774]: I1003 15:15:31.467982 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dshx4"] Oct 03 15:15:31 crc kubenswrapper[4774]: W1003 15:15:31.475873 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod776f45f5_5644_428e_a25f_9e3b36960fd9.slice/crio-68df96a1ac600d83d3ac3e6894664a9c4c46d47856e99321b10fc1600a983067 WatchSource:0}: Error finding container 68df96a1ac600d83d3ac3e6894664a9c4c46d47856e99321b10fc1600a983067: Status 404 returned error can't find the container with id 68df96a1ac600d83d3ac3e6894664a9c4c46d47856e99321b10fc1600a983067 Oct 03 15:15:32 crc kubenswrapper[4774]: I1003 15:15:32.501253 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dshx4" event={"ID":"776f45f5-5644-428e-a25f-9e3b36960fd9","Type":"ContainerStarted","Data":"2f858c022ef8924b3d381b99b7c5f5b80d0c3ff47b98935190fe22ee0198f8f7"} Oct 03 15:15:32 crc kubenswrapper[4774]: I1003 15:15:32.501783 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dshx4" event={"ID":"776f45f5-5644-428e-a25f-9e3b36960fd9","Type":"ContainerStarted","Data":"68df96a1ac600d83d3ac3e6894664a9c4c46d47856e99321b10fc1600a983067"} Oct 03 15:15:32 crc kubenswrapper[4774]: I1003 15:15:32.520155 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dshx4" podStartSLOduration=1.8922099220000002 podStartE2EDuration="2.520131549s" podCreationTimestamp="2025-10-03 15:15:30 +0000 UTC" firstStartedPulling="2025-10-03 15:15:31.479336017 +0000 UTC m=+1954.068539469" lastFinishedPulling="2025-10-03 15:15:32.107257604 +0000 UTC m=+1954.696461096" 
observedRunningTime="2025-10-03 15:15:32.515202467 +0000 UTC m=+1955.104405939" watchObservedRunningTime="2025-10-03 15:15:32.520131549 +0000 UTC m=+1955.109335011" Oct 03 15:15:42 crc kubenswrapper[4774]: I1003 15:15:42.639535 4774 generic.go:334] "Generic (PLEG): container finished" podID="776f45f5-5644-428e-a25f-9e3b36960fd9" containerID="2f858c022ef8924b3d381b99b7c5f5b80d0c3ff47b98935190fe22ee0198f8f7" exitCode=0 Oct 03 15:15:42 crc kubenswrapper[4774]: I1003 15:15:42.639624 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dshx4" event={"ID":"776f45f5-5644-428e-a25f-9e3b36960fd9","Type":"ContainerDied","Data":"2f858c022ef8924b3d381b99b7c5f5b80d0c3ff47b98935190fe22ee0198f8f7"} Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.125614 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dshx4" Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.300433 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpnl4\" (UniqueName: \"kubernetes.io/projected/776f45f5-5644-428e-a25f-9e3b36960fd9-kube-api-access-mpnl4\") pod \"776f45f5-5644-428e-a25f-9e3b36960fd9\" (UID: \"776f45f5-5644-428e-a25f-9e3b36960fd9\") " Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.300510 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/776f45f5-5644-428e-a25f-9e3b36960fd9-inventory\") pod \"776f45f5-5644-428e-a25f-9e3b36960fd9\" (UID: \"776f45f5-5644-428e-a25f-9e3b36960fd9\") " Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.300817 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/776f45f5-5644-428e-a25f-9e3b36960fd9-ssh-key\") pod \"776f45f5-5644-428e-a25f-9e3b36960fd9\" (UID: 
\"776f45f5-5644-428e-a25f-9e3b36960fd9\") " Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.308130 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/776f45f5-5644-428e-a25f-9e3b36960fd9-kube-api-access-mpnl4" (OuterVolumeSpecName: "kube-api-access-mpnl4") pod "776f45f5-5644-428e-a25f-9e3b36960fd9" (UID: "776f45f5-5644-428e-a25f-9e3b36960fd9"). InnerVolumeSpecName "kube-api-access-mpnl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.326908 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/776f45f5-5644-428e-a25f-9e3b36960fd9-inventory" (OuterVolumeSpecName: "inventory") pod "776f45f5-5644-428e-a25f-9e3b36960fd9" (UID: "776f45f5-5644-428e-a25f-9e3b36960fd9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.329670 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/776f45f5-5644-428e-a25f-9e3b36960fd9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "776f45f5-5644-428e-a25f-9e3b36960fd9" (UID: "776f45f5-5644-428e-a25f-9e3b36960fd9"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.405068 4774 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/776f45f5-5644-428e-a25f-9e3b36960fd9-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.405108 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpnl4\" (UniqueName: \"kubernetes.io/projected/776f45f5-5644-428e-a25f-9e3b36960fd9-kube-api-access-mpnl4\") on node \"crc\" DevicePath \"\"" Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.405122 4774 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/776f45f5-5644-428e-a25f-9e3b36960fd9-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.665659 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dshx4" event={"ID":"776f45f5-5644-428e-a25f-9e3b36960fd9","Type":"ContainerDied","Data":"68df96a1ac600d83d3ac3e6894664a9c4c46d47856e99321b10fc1600a983067"} Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.665704 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68df96a1ac600d83d3ac3e6894664a9c4c46d47856e99321b10fc1600a983067" Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.665744 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dshx4" Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.777915 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf"] Oct 03 15:15:44 crc kubenswrapper[4774]: E1003 15:15:44.778433 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="776f45f5-5644-428e-a25f-9e3b36960fd9" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.778455 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="776f45f5-5644-428e-a25f-9e3b36960fd9" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.778680 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="776f45f5-5644-428e-a25f-9e3b36960fd9" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.779533 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.783919 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.783960 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.783924 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.783924 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.784606 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.784800 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.784956 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.784763 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7bdzq" Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.792719 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf"] Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.912969 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.913009 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.913045 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.913083 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.913113 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.913146 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.913163 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.913194 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.913216 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2ba66309-584b-4165-86a5-ca30af49d159-openstack-edpm-ipam-ovn-default-certs-0\") 
pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.913235 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2ba66309-584b-4165-86a5-ca30af49d159-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.913256 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x94rg\" (UniqueName: \"kubernetes.io/projected/2ba66309-584b-4165-86a5-ca30af49d159-kube-api-access-x94rg\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.913284 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2ba66309-584b-4165-86a5-ca30af49d159-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.913315 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/2ba66309-584b-4165-86a5-ca30af49d159-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:44 crc kubenswrapper[4774]: I1003 15:15:44.913347 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:45 crc kubenswrapper[4774]: I1003 15:15:45.014765 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:45 crc kubenswrapper[4774]: I1003 15:15:45.015127 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:45 crc kubenswrapper[4774]: I1003 15:15:45.015252 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2ba66309-584b-4165-86a5-ca30af49d159-openstack-edpm-ipam-ovn-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:45 crc kubenswrapper[4774]: I1003 15:15:45.015396 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2ba66309-584b-4165-86a5-ca30af49d159-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:45 crc kubenswrapper[4774]: I1003 15:15:45.015507 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x94rg\" (UniqueName: \"kubernetes.io/projected/2ba66309-584b-4165-86a5-ca30af49d159-kube-api-access-x94rg\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:45 crc kubenswrapper[4774]: I1003 15:15:45.015647 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2ba66309-584b-4165-86a5-ca30af49d159-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:45 crc kubenswrapper[4774]: I1003 15:15:45.015779 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2ba66309-584b-4165-86a5-ca30af49d159-openstack-edpm-ipam-telemetry-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:45 crc kubenswrapper[4774]: I1003 15:15:45.015902 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:45 crc kubenswrapper[4774]: I1003 15:15:45.016066 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:45 crc kubenswrapper[4774]: I1003 15:15:45.016172 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:45 crc kubenswrapper[4774]: I1003 15:15:45.016322 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:45 crc kubenswrapper[4774]: I1003 15:15:45.016492 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:45 crc kubenswrapper[4774]: I1003 15:15:45.016594 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:45 crc kubenswrapper[4774]: I1003 15:15:45.016712 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:45 crc kubenswrapper[4774]: I1003 15:15:45.020141 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:45 crc kubenswrapper[4774]: I1003 15:15:45.020894 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:45 crc kubenswrapper[4774]: I1003 15:15:45.021054 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2ba66309-584b-4165-86a5-ca30af49d159-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:45 crc kubenswrapper[4774]: I1003 15:15:45.021720 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2ba66309-584b-4165-86a5-ca30af49d159-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:45 crc kubenswrapper[4774]: I1003 15:15:45.021836 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2ba66309-584b-4165-86a5-ca30af49d159-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:45 crc kubenswrapper[4774]: I1003 15:15:45.022213 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:45 crc kubenswrapper[4774]: I1003 15:15:45.025901 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:45 crc kubenswrapper[4774]: I1003 15:15:45.026648 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:45 crc kubenswrapper[4774]: I1003 15:15:45.028073 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:45 crc kubenswrapper[4774]: I1003 15:15:45.033609 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: 
\"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:45 crc kubenswrapper[4774]: I1003 15:15:45.034459 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:45 crc kubenswrapper[4774]: I1003 15:15:45.041550 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2ba66309-584b-4165-86a5-ca30af49d159-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:45 crc kubenswrapper[4774]: I1003 15:15:45.046109 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x94rg\" (UniqueName: \"kubernetes.io/projected/2ba66309-584b-4165-86a5-ca30af49d159-kube-api-access-x94rg\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:45 crc kubenswrapper[4774]: I1003 15:15:45.046731 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x58cf\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:45 crc 
kubenswrapper[4774]: I1003 15:15:45.105072 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:15:45 crc kubenswrapper[4774]: I1003 15:15:45.640321 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf"] Oct 03 15:15:45 crc kubenswrapper[4774]: I1003 15:15:45.677390 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" event={"ID":"2ba66309-584b-4165-86a5-ca30af49d159","Type":"ContainerStarted","Data":"a03293480945138c4a616793b97250193d8c50ea4486874d19993837fb7837b3"} Oct 03 15:15:46 crc kubenswrapper[4774]: I1003 15:15:46.694317 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" event={"ID":"2ba66309-584b-4165-86a5-ca30af49d159","Type":"ContainerStarted","Data":"d3f3c1351296e967b7b39be9fd8f66aa9d7a131d5653db71eeeb653a814eb5f0"} Oct 03 15:15:46 crc kubenswrapper[4774]: I1003 15:15:46.747103 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" podStartSLOduration=2.310121739 podStartE2EDuration="2.747080723s" podCreationTimestamp="2025-10-03 15:15:44 +0000 UTC" firstStartedPulling="2025-10-03 15:15:45.638974936 +0000 UTC m=+1968.228178388" lastFinishedPulling="2025-10-03 15:15:46.07593388 +0000 UTC m=+1968.665137372" observedRunningTime="2025-10-03 15:15:46.730954642 +0000 UTC m=+1969.320158114" watchObservedRunningTime="2025-10-03 15:15:46.747080723 +0000 UTC m=+1969.336284185" Oct 03 15:16:26 crc kubenswrapper[4774]: I1003 15:16:26.135427 4774 generic.go:334] "Generic (PLEG): container finished" podID="2ba66309-584b-4165-86a5-ca30af49d159" containerID="d3f3c1351296e967b7b39be9fd8f66aa9d7a131d5653db71eeeb653a814eb5f0" exitCode=0 Oct 03 15:16:26 crc 
kubenswrapper[4774]: I1003 15:16:26.135539 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" event={"ID":"2ba66309-584b-4165-86a5-ca30af49d159","Type":"ContainerDied","Data":"d3f3c1351296e967b7b39be9fd8f66aa9d7a131d5653db71eeeb653a814eb5f0"} Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.548490 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.726761 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-repo-setup-combined-ca-bundle\") pod \"2ba66309-584b-4165-86a5-ca30af49d159\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.726809 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-libvirt-combined-ca-bundle\") pod \"2ba66309-584b-4165-86a5-ca30af49d159\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.726840 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-inventory\") pod \"2ba66309-584b-4165-86a5-ca30af49d159\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.726870 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-ovn-combined-ca-bundle\") pod \"2ba66309-584b-4165-86a5-ca30af49d159\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " 
Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.727633 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x94rg\" (UniqueName: \"kubernetes.io/projected/2ba66309-584b-4165-86a5-ca30af49d159-kube-api-access-x94rg\") pod \"2ba66309-584b-4165-86a5-ca30af49d159\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.727686 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2ba66309-584b-4165-86a5-ca30af49d159-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"2ba66309-584b-4165-86a5-ca30af49d159\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.727778 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-telemetry-combined-ca-bundle\") pod \"2ba66309-584b-4165-86a5-ca30af49d159\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.727809 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-neutron-metadata-combined-ca-bundle\") pod \"2ba66309-584b-4165-86a5-ca30af49d159\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.727835 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-bootstrap-combined-ca-bundle\") pod \"2ba66309-584b-4165-86a5-ca30af49d159\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 
15:16:27.727887 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-nova-combined-ca-bundle\") pod \"2ba66309-584b-4165-86a5-ca30af49d159\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.727954 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-ssh-key\") pod \"2ba66309-584b-4165-86a5-ca30af49d159\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.728033 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2ba66309-584b-4165-86a5-ca30af49d159-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"2ba66309-584b-4165-86a5-ca30af49d159\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.728064 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2ba66309-584b-4165-86a5-ca30af49d159-openstack-edpm-ipam-ovn-default-certs-0\") pod \"2ba66309-584b-4165-86a5-ca30af49d159\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.728119 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2ba66309-584b-4165-86a5-ca30af49d159-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"2ba66309-584b-4165-86a5-ca30af49d159\" (UID: \"2ba66309-584b-4165-86a5-ca30af49d159\") " Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.735894 4774 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "2ba66309-584b-4165-86a5-ca30af49d159" (UID: "2ba66309-584b-4165-86a5-ca30af49d159"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.735916 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "2ba66309-584b-4165-86a5-ca30af49d159" (UID: "2ba66309-584b-4165-86a5-ca30af49d159"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.735969 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "2ba66309-584b-4165-86a5-ca30af49d159" (UID: "2ba66309-584b-4165-86a5-ca30af49d159"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.735980 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "2ba66309-584b-4165-86a5-ca30af49d159" (UID: "2ba66309-584b-4165-86a5-ca30af49d159"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.736064 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ba66309-584b-4165-86a5-ca30af49d159-kube-api-access-x94rg" (OuterVolumeSpecName: "kube-api-access-x94rg") pod "2ba66309-584b-4165-86a5-ca30af49d159" (UID: "2ba66309-584b-4165-86a5-ca30af49d159"). InnerVolumeSpecName "kube-api-access-x94rg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.736638 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "2ba66309-584b-4165-86a5-ca30af49d159" (UID: "2ba66309-584b-4165-86a5-ca30af49d159"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.737113 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ba66309-584b-4165-86a5-ca30af49d159-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "2ba66309-584b-4165-86a5-ca30af49d159" (UID: "2ba66309-584b-4165-86a5-ca30af49d159"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.737300 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ba66309-584b-4165-86a5-ca30af49d159-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "2ba66309-584b-4165-86a5-ca30af49d159" (UID: "2ba66309-584b-4165-86a5-ca30af49d159"). 
InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.737541 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "2ba66309-584b-4165-86a5-ca30af49d159" (UID: "2ba66309-584b-4165-86a5-ca30af49d159"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.737978 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ba66309-584b-4165-86a5-ca30af49d159-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "2ba66309-584b-4165-86a5-ca30af49d159" (UID: "2ba66309-584b-4165-86a5-ca30af49d159"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.739351 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ba66309-584b-4165-86a5-ca30af49d159-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "2ba66309-584b-4165-86a5-ca30af49d159" (UID: "2ba66309-584b-4165-86a5-ca30af49d159"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.741635 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "2ba66309-584b-4165-86a5-ca30af49d159" (UID: "2ba66309-584b-4165-86a5-ca30af49d159"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.759575 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2ba66309-584b-4165-86a5-ca30af49d159" (UID: "2ba66309-584b-4165-86a5-ca30af49d159"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.774390 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-inventory" (OuterVolumeSpecName: "inventory") pod "2ba66309-584b-4165-86a5-ca30af49d159" (UID: "2ba66309-584b-4165-86a5-ca30af49d159"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.830943 4774 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2ba66309-584b-4165-86a5-ca30af49d159-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.830979 4774 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2ba66309-584b-4165-86a5-ca30af49d159-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.830991 4774 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2ba66309-584b-4165-86a5-ca30af49d159-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.831006 4774 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.831018 4774 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.831027 4774 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.831036 4774 reconciler_common.go:293] "Volume detached for volume 
\"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.831045 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x94rg\" (UniqueName: \"kubernetes.io/projected/2ba66309-584b-4165-86a5-ca30af49d159-kube-api-access-x94rg\") on node \"crc\" DevicePath \"\"" Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.831056 4774 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2ba66309-584b-4165-86a5-ca30af49d159-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.831065 4774 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.831074 4774 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.831083 4774 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:16:27 crc kubenswrapper[4774]: I1003 15:16:27.831093 4774 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:16:27 crc 
kubenswrapper[4774]: I1003 15:16:27.831100 4774 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2ba66309-584b-4165-86a5-ca30af49d159-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 15:16:28 crc kubenswrapper[4774]: I1003 15:16:28.160952 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" event={"ID":"2ba66309-584b-4165-86a5-ca30af49d159","Type":"ContainerDied","Data":"a03293480945138c4a616793b97250193d8c50ea4486874d19993837fb7837b3"} Oct 03 15:16:28 crc kubenswrapper[4774]: I1003 15:16:28.161263 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a03293480945138c4a616793b97250193d8c50ea4486874d19993837fb7837b3" Oct 03 15:16:28 crc kubenswrapper[4774]: I1003 15:16:28.161024 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x58cf" Oct 03 15:16:28 crc kubenswrapper[4774]: I1003 15:16:28.287400 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-45z47"] Oct 03 15:16:28 crc kubenswrapper[4774]: E1003 15:16:28.287884 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ba66309-584b-4165-86a5-ca30af49d159" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 03 15:16:28 crc kubenswrapper[4774]: I1003 15:16:28.287900 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ba66309-584b-4165-86a5-ca30af49d159" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 03 15:16:28 crc kubenswrapper[4774]: I1003 15:16:28.288119 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ba66309-584b-4165-86a5-ca30af49d159" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 03 15:16:28 crc kubenswrapper[4774]: I1003 15:16:28.288958 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45z47" Oct 03 15:16:28 crc kubenswrapper[4774]: I1003 15:16:28.293225 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 15:16:28 crc kubenswrapper[4774]: I1003 15:16:28.293327 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 15:16:28 crc kubenswrapper[4774]: I1003 15:16:28.293460 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 03 15:16:28 crc kubenswrapper[4774]: I1003 15:16:28.293240 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 15:16:28 crc kubenswrapper[4774]: I1003 15:16:28.293836 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7bdzq" Oct 03 15:16:28 crc kubenswrapper[4774]: I1003 15:16:28.303701 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-45z47"] Oct 03 15:16:28 crc kubenswrapper[4774]: I1003 15:16:28.444001 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/579db85b-3b4c-45b4-8bae-1b5a02c80e15-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-45z47\" (UID: \"579db85b-3b4c-45b4-8bae-1b5a02c80e15\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45z47" Oct 03 15:16:28 crc kubenswrapper[4774]: I1003 15:16:28.444099 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/579db85b-3b4c-45b4-8bae-1b5a02c80e15-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-45z47\" (UID: \"579db85b-3b4c-45b4-8bae-1b5a02c80e15\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45z47" Oct 03 15:16:28 crc kubenswrapper[4774]: I1003 15:16:28.444338 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ppnt\" (UniqueName: \"kubernetes.io/projected/579db85b-3b4c-45b4-8bae-1b5a02c80e15-kube-api-access-2ppnt\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-45z47\" (UID: \"579db85b-3b4c-45b4-8bae-1b5a02c80e15\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45z47" Oct 03 15:16:28 crc kubenswrapper[4774]: I1003 15:16:28.444701 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/579db85b-3b4c-45b4-8bae-1b5a02c80e15-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-45z47\" (UID: \"579db85b-3b4c-45b4-8bae-1b5a02c80e15\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45z47" Oct 03 15:16:28 crc kubenswrapper[4774]: I1003 15:16:28.444912 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/579db85b-3b4c-45b4-8bae-1b5a02c80e15-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-45z47\" (UID: \"579db85b-3b4c-45b4-8bae-1b5a02c80e15\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45z47" Oct 03 15:16:28 crc kubenswrapper[4774]: I1003 15:16:28.547336 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/579db85b-3b4c-45b4-8bae-1b5a02c80e15-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-45z47\" (UID: \"579db85b-3b4c-45b4-8bae-1b5a02c80e15\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45z47" Oct 03 15:16:28 crc kubenswrapper[4774]: I1003 15:16:28.547529 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/579db85b-3b4c-45b4-8bae-1b5a02c80e15-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-45z47\" (UID: \"579db85b-3b4c-45b4-8bae-1b5a02c80e15\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45z47" Oct 03 15:16:28 crc kubenswrapper[4774]: I1003 15:16:28.547656 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/579db85b-3b4c-45b4-8bae-1b5a02c80e15-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-45z47\" (UID: \"579db85b-3b4c-45b4-8bae-1b5a02c80e15\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45z47" Oct 03 15:16:28 crc kubenswrapper[4774]: I1003 15:16:28.547707 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/579db85b-3b4c-45b4-8bae-1b5a02c80e15-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-45z47\" (UID: \"579db85b-3b4c-45b4-8bae-1b5a02c80e15\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45z47" Oct 03 15:16:28 crc kubenswrapper[4774]: I1003 15:16:28.547778 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ppnt\" (UniqueName: \"kubernetes.io/projected/579db85b-3b4c-45b4-8bae-1b5a02c80e15-kube-api-access-2ppnt\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-45z47\" (UID: \"579db85b-3b4c-45b4-8bae-1b5a02c80e15\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45z47" Oct 03 15:16:28 crc kubenswrapper[4774]: I1003 15:16:28.552491 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/579db85b-3b4c-45b4-8bae-1b5a02c80e15-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-45z47\" (UID: \"579db85b-3b4c-45b4-8bae-1b5a02c80e15\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45z47" Oct 03 15:16:28 crc 
kubenswrapper[4774]: I1003 15:16:28.556250 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/579db85b-3b4c-45b4-8bae-1b5a02c80e15-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-45z47\" (UID: \"579db85b-3b4c-45b4-8bae-1b5a02c80e15\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45z47" Oct 03 15:16:28 crc kubenswrapper[4774]: I1003 15:16:28.560280 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/579db85b-3b4c-45b4-8bae-1b5a02c80e15-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-45z47\" (UID: \"579db85b-3b4c-45b4-8bae-1b5a02c80e15\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45z47" Oct 03 15:16:28 crc kubenswrapper[4774]: I1003 15:16:28.560645 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/579db85b-3b4c-45b4-8bae-1b5a02c80e15-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-45z47\" (UID: \"579db85b-3b4c-45b4-8bae-1b5a02c80e15\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45z47" Oct 03 15:16:28 crc kubenswrapper[4774]: I1003 15:16:28.564001 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ppnt\" (UniqueName: \"kubernetes.io/projected/579db85b-3b4c-45b4-8bae-1b5a02c80e15-kube-api-access-2ppnt\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-45z47\" (UID: \"579db85b-3b4c-45b4-8bae-1b5a02c80e15\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45z47" Oct 03 15:16:28 crc kubenswrapper[4774]: I1003 15:16:28.622025 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45z47" Oct 03 15:16:28 crc kubenswrapper[4774]: I1003 15:16:28.948885 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-45z47"] Oct 03 15:16:28 crc kubenswrapper[4774]: I1003 15:16:28.953148 4774 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 15:16:29 crc kubenswrapper[4774]: I1003 15:16:29.170877 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45z47" event={"ID":"579db85b-3b4c-45b4-8bae-1b5a02c80e15","Type":"ContainerStarted","Data":"23571a3b7df12dcbe5a06e820ce4bcbc7a6933bb6a8b8f5811caba7539d14444"} Oct 03 15:16:30 crc kubenswrapper[4774]: I1003 15:16:30.195638 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45z47" event={"ID":"579db85b-3b4c-45b4-8bae-1b5a02c80e15","Type":"ContainerStarted","Data":"ce3da622cedc77d908ab098dd1dbe442b8cbcac1c4d096b933fa28c7ea4d8c04"} Oct 03 15:16:30 crc kubenswrapper[4774]: I1003 15:16:30.221914 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45z47" podStartSLOduration=1.727732405 podStartE2EDuration="2.221896723s" podCreationTimestamp="2025-10-03 15:16:28 +0000 UTC" firstStartedPulling="2025-10-03 15:16:28.95277467 +0000 UTC m=+2011.541978122" lastFinishedPulling="2025-10-03 15:16:29.446938978 +0000 UTC m=+2012.036142440" observedRunningTime="2025-10-03 15:16:30.210666975 +0000 UTC m=+2012.799870417" watchObservedRunningTime="2025-10-03 15:16:30.221896723 +0000 UTC m=+2012.811100175" Oct 03 15:16:50 crc kubenswrapper[4774]: I1003 15:16:50.653178 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:16:50 crc kubenswrapper[4774]: I1003 15:16:50.655146 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:17:20 crc kubenswrapper[4774]: I1003 15:17:20.658663 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:17:20 crc kubenswrapper[4774]: I1003 15:17:20.659516 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:17:32 crc kubenswrapper[4774]: I1003 15:17:32.810177 4774 generic.go:334] "Generic (PLEG): container finished" podID="579db85b-3b4c-45b4-8bae-1b5a02c80e15" containerID="ce3da622cedc77d908ab098dd1dbe442b8cbcac1c4d096b933fa28c7ea4d8c04" exitCode=0 Oct 03 15:17:32 crc kubenswrapper[4774]: I1003 15:17:32.810270 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45z47" event={"ID":"579db85b-3b4c-45b4-8bae-1b5a02c80e15","Type":"ContainerDied","Data":"ce3da622cedc77d908ab098dd1dbe442b8cbcac1c4d096b933fa28c7ea4d8c04"} Oct 03 15:17:34 crc kubenswrapper[4774]: I1003 15:17:34.240968 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45z47" Oct 03 15:17:34 crc kubenswrapper[4774]: I1003 15:17:34.381015 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/579db85b-3b4c-45b4-8bae-1b5a02c80e15-ovn-combined-ca-bundle\") pod \"579db85b-3b4c-45b4-8bae-1b5a02c80e15\" (UID: \"579db85b-3b4c-45b4-8bae-1b5a02c80e15\") " Oct 03 15:17:34 crc kubenswrapper[4774]: I1003 15:17:34.381106 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/579db85b-3b4c-45b4-8bae-1b5a02c80e15-inventory\") pod \"579db85b-3b4c-45b4-8bae-1b5a02c80e15\" (UID: \"579db85b-3b4c-45b4-8bae-1b5a02c80e15\") " Oct 03 15:17:34 crc kubenswrapper[4774]: I1003 15:17:34.381224 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/579db85b-3b4c-45b4-8bae-1b5a02c80e15-ssh-key\") pod \"579db85b-3b4c-45b4-8bae-1b5a02c80e15\" (UID: \"579db85b-3b4c-45b4-8bae-1b5a02c80e15\") " Oct 03 15:17:34 crc kubenswrapper[4774]: I1003 15:17:34.381346 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/579db85b-3b4c-45b4-8bae-1b5a02c80e15-ovncontroller-config-0\") pod \"579db85b-3b4c-45b4-8bae-1b5a02c80e15\" (UID: \"579db85b-3b4c-45b4-8bae-1b5a02c80e15\") " Oct 03 15:17:34 crc kubenswrapper[4774]: I1003 15:17:34.381440 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ppnt\" (UniqueName: \"kubernetes.io/projected/579db85b-3b4c-45b4-8bae-1b5a02c80e15-kube-api-access-2ppnt\") pod \"579db85b-3b4c-45b4-8bae-1b5a02c80e15\" (UID: \"579db85b-3b4c-45b4-8bae-1b5a02c80e15\") " Oct 03 15:17:34 crc kubenswrapper[4774]: I1003 15:17:34.386464 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/579db85b-3b4c-45b4-8bae-1b5a02c80e15-kube-api-access-2ppnt" (OuterVolumeSpecName: "kube-api-access-2ppnt") pod "579db85b-3b4c-45b4-8bae-1b5a02c80e15" (UID: "579db85b-3b4c-45b4-8bae-1b5a02c80e15"). InnerVolumeSpecName "kube-api-access-2ppnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:17:34 crc kubenswrapper[4774]: I1003 15:17:34.387581 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/579db85b-3b4c-45b4-8bae-1b5a02c80e15-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "579db85b-3b4c-45b4-8bae-1b5a02c80e15" (UID: "579db85b-3b4c-45b4-8bae-1b5a02c80e15"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:17:34 crc kubenswrapper[4774]: I1003 15:17:34.411507 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/579db85b-3b4c-45b4-8bae-1b5a02c80e15-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "579db85b-3b4c-45b4-8bae-1b5a02c80e15" (UID: "579db85b-3b4c-45b4-8bae-1b5a02c80e15"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:17:34 crc kubenswrapper[4774]: I1003 15:17:34.413206 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/579db85b-3b4c-45b4-8bae-1b5a02c80e15-inventory" (OuterVolumeSpecName: "inventory") pod "579db85b-3b4c-45b4-8bae-1b5a02c80e15" (UID: "579db85b-3b4c-45b4-8bae-1b5a02c80e15"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:17:34 crc kubenswrapper[4774]: I1003 15:17:34.424827 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/579db85b-3b4c-45b4-8bae-1b5a02c80e15-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "579db85b-3b4c-45b4-8bae-1b5a02c80e15" (UID: "579db85b-3b4c-45b4-8bae-1b5a02c80e15"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:17:34 crc kubenswrapper[4774]: I1003 15:17:34.485357 4774 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/579db85b-3b4c-45b4-8bae-1b5a02c80e15-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 15:17:34 crc kubenswrapper[4774]: I1003 15:17:34.485437 4774 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/579db85b-3b4c-45b4-8bae-1b5a02c80e15-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 15:17:34 crc kubenswrapper[4774]: I1003 15:17:34.485460 4774 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/579db85b-3b4c-45b4-8bae-1b5a02c80e15-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 03 15:17:34 crc kubenswrapper[4774]: I1003 15:17:34.485482 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ppnt\" (UniqueName: \"kubernetes.io/projected/579db85b-3b4c-45b4-8bae-1b5a02c80e15-kube-api-access-2ppnt\") on node \"crc\" DevicePath \"\"" Oct 03 15:17:34 crc kubenswrapper[4774]: I1003 15:17:34.485502 4774 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/579db85b-3b4c-45b4-8bae-1b5a02c80e15-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:17:34 crc kubenswrapper[4774]: I1003 15:17:34.833725 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45z47" event={"ID":"579db85b-3b4c-45b4-8bae-1b5a02c80e15","Type":"ContainerDied","Data":"23571a3b7df12dcbe5a06e820ce4bcbc7a6933bb6a8b8f5811caba7539d14444"} Oct 03 15:17:34 crc kubenswrapper[4774]: I1003 15:17:34.833776 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23571a3b7df12dcbe5a06e820ce4bcbc7a6933bb6a8b8f5811caba7539d14444" Oct 03 15:17:34 crc kubenswrapper[4774]: I1003 15:17:34.833876 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45z47" Oct 03 15:17:34 crc kubenswrapper[4774]: I1003 15:17:34.983712 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6"] Oct 03 15:17:34 crc kubenswrapper[4774]: E1003 15:17:34.984492 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="579db85b-3b4c-45b4-8bae-1b5a02c80e15" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 03 15:17:34 crc kubenswrapper[4774]: I1003 15:17:34.984526 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="579db85b-3b4c-45b4-8bae-1b5a02c80e15" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 03 15:17:34 crc kubenswrapper[4774]: I1003 15:17:34.984851 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="579db85b-3b4c-45b4-8bae-1b5a02c80e15" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 03 15:17:34 crc kubenswrapper[4774]: I1003 15:17:34.986028 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6" Oct 03 15:17:34 crc kubenswrapper[4774]: I1003 15:17:34.990833 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7bdzq" Oct 03 15:17:34 crc kubenswrapper[4774]: I1003 15:17:34.991021 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 03 15:17:34 crc kubenswrapper[4774]: I1003 15:17:34.995803 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 15:17:34 crc kubenswrapper[4774]: I1003 15:17:34.996767 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 15:17:34 crc kubenswrapper[4774]: I1003 15:17:34.997170 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 03 15:17:34 crc kubenswrapper[4774]: I1003 15:17:34.997278 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 15:17:35 crc kubenswrapper[4774]: I1003 15:17:35.016477 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6"] Oct 03 15:17:35 crc kubenswrapper[4774]: I1003 15:17:35.095862 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/62582218-c190-4edd-8539-5ca8e8d348e3-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6\" (UID: \"62582218-c190-4edd-8539-5ca8e8d348e3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6" Oct 03 15:17:35 crc kubenswrapper[4774]: I1003 15:17:35.096001 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/62582218-c190-4edd-8539-5ca8e8d348e3-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6\" (UID: \"62582218-c190-4edd-8539-5ca8e8d348e3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6" Oct 03 15:17:35 crc kubenswrapper[4774]: I1003 15:17:35.096038 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62582218-c190-4edd-8539-5ca8e8d348e3-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6\" (UID: \"62582218-c190-4edd-8539-5ca8e8d348e3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6" Oct 03 15:17:35 crc kubenswrapper[4774]: I1003 15:17:35.096065 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62582218-c190-4edd-8539-5ca8e8d348e3-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6\" (UID: \"62582218-c190-4edd-8539-5ca8e8d348e3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6" Oct 03 15:17:35 crc kubenswrapper[4774]: I1003 15:17:35.096124 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62582218-c190-4edd-8539-5ca8e8d348e3-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6\" (UID: \"62582218-c190-4edd-8539-5ca8e8d348e3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6" Oct 03 15:17:35 crc kubenswrapper[4774]: I1003 15:17:35.096155 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-dsfbq\" (UniqueName: \"kubernetes.io/projected/62582218-c190-4edd-8539-5ca8e8d348e3-kube-api-access-dsfbq\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6\" (UID: \"62582218-c190-4edd-8539-5ca8e8d348e3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6" Oct 03 15:17:35 crc kubenswrapper[4774]: I1003 15:17:35.197469 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/62582218-c190-4edd-8539-5ca8e8d348e3-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6\" (UID: \"62582218-c190-4edd-8539-5ca8e8d348e3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6" Oct 03 15:17:35 crc kubenswrapper[4774]: I1003 15:17:35.197522 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62582218-c190-4edd-8539-5ca8e8d348e3-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6\" (UID: \"62582218-c190-4edd-8539-5ca8e8d348e3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6" Oct 03 15:17:35 crc kubenswrapper[4774]: I1003 15:17:35.197551 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62582218-c190-4edd-8539-5ca8e8d348e3-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6\" (UID: \"62582218-c190-4edd-8539-5ca8e8d348e3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6" Oct 03 15:17:35 crc kubenswrapper[4774]: I1003 15:17:35.197614 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62582218-c190-4edd-8539-5ca8e8d348e3-neutron-metadata-combined-ca-bundle\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6\" (UID: \"62582218-c190-4edd-8539-5ca8e8d348e3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6" Oct 03 15:17:35 crc kubenswrapper[4774]: I1003 15:17:35.197752 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsfbq\" (UniqueName: \"kubernetes.io/projected/62582218-c190-4edd-8539-5ca8e8d348e3-kube-api-access-dsfbq\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6\" (UID: \"62582218-c190-4edd-8539-5ca8e8d348e3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6" Oct 03 15:17:35 crc kubenswrapper[4774]: I1003 15:17:35.197877 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/62582218-c190-4edd-8539-5ca8e8d348e3-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6\" (UID: \"62582218-c190-4edd-8539-5ca8e8d348e3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6" Oct 03 15:17:35 crc kubenswrapper[4774]: I1003 15:17:35.204884 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/62582218-c190-4edd-8539-5ca8e8d348e3-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6\" (UID: \"62582218-c190-4edd-8539-5ca8e8d348e3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6" Oct 03 15:17:35 crc kubenswrapper[4774]: I1003 15:17:35.206679 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62582218-c190-4edd-8539-5ca8e8d348e3-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6\" (UID: \"62582218-c190-4edd-8539-5ca8e8d348e3\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6" Oct 03 15:17:35 crc kubenswrapper[4774]: I1003 15:17:35.207034 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62582218-c190-4edd-8539-5ca8e8d348e3-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6\" (UID: \"62582218-c190-4edd-8539-5ca8e8d348e3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6" Oct 03 15:17:35 crc kubenswrapper[4774]: I1003 15:17:35.214905 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62582218-c190-4edd-8539-5ca8e8d348e3-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6\" (UID: \"62582218-c190-4edd-8539-5ca8e8d348e3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6" Oct 03 15:17:35 crc kubenswrapper[4774]: I1003 15:17:35.221150 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/62582218-c190-4edd-8539-5ca8e8d348e3-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6\" (UID: \"62582218-c190-4edd-8539-5ca8e8d348e3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6" Oct 03 15:17:35 crc kubenswrapper[4774]: I1003 15:17:35.229110 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsfbq\" (UniqueName: \"kubernetes.io/projected/62582218-c190-4edd-8539-5ca8e8d348e3-kube-api-access-dsfbq\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6\" (UID: \"62582218-c190-4edd-8539-5ca8e8d348e3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6" Oct 03 15:17:35 crc kubenswrapper[4774]: I1003 15:17:35.314358 
4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6" Oct 03 15:17:35 crc kubenswrapper[4774]: I1003 15:17:35.878241 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6"] Oct 03 15:17:35 crc kubenswrapper[4774]: W1003 15:17:35.880427 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62582218_c190_4edd_8539_5ca8e8d348e3.slice/crio-63b341da1ace800ba9555a474a1316562fa99a215b109c4caec227564f2a488f WatchSource:0}: Error finding container 63b341da1ace800ba9555a474a1316562fa99a215b109c4caec227564f2a488f: Status 404 returned error can't find the container with id 63b341da1ace800ba9555a474a1316562fa99a215b109c4caec227564f2a488f Oct 03 15:17:36 crc kubenswrapper[4774]: I1003 15:17:36.856495 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6" event={"ID":"62582218-c190-4edd-8539-5ca8e8d348e3","Type":"ContainerStarted","Data":"63b341da1ace800ba9555a474a1316562fa99a215b109c4caec227564f2a488f"} Oct 03 15:17:37 crc kubenswrapper[4774]: I1003 15:17:37.872649 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6" event={"ID":"62582218-c190-4edd-8539-5ca8e8d348e3","Type":"ContainerStarted","Data":"87de06fc3651bead344242b3609142ed8f8c6080ff4d41c735abf29c800ed59c"} Oct 03 15:17:37 crc kubenswrapper[4774]: I1003 15:17:37.903653 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6" podStartSLOduration=2.910150285 podStartE2EDuration="3.9036245s" podCreationTimestamp="2025-10-03 15:17:34 +0000 UTC" firstStartedPulling="2025-10-03 15:17:35.883338931 +0000 UTC m=+2078.472542383" 
lastFinishedPulling="2025-10-03 15:17:36.876813136 +0000 UTC m=+2079.466016598" observedRunningTime="2025-10-03 15:17:37.897157349 +0000 UTC m=+2080.486360831" watchObservedRunningTime="2025-10-03 15:17:37.9036245 +0000 UTC m=+2080.492827992" Oct 03 15:17:50 crc kubenswrapper[4774]: I1003 15:17:50.653418 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:17:50 crc kubenswrapper[4774]: I1003 15:17:50.654288 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:17:50 crc kubenswrapper[4774]: I1003 15:17:50.654421 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" Oct 03 15:17:50 crc kubenswrapper[4774]: I1003 15:17:50.655583 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7f89eaf64e7c7bc9ea3287549d0d73eb381c82bb1cc017882eb6cc10db2b2164"} pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 15:17:50 crc kubenswrapper[4774]: I1003 15:17:50.655702 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" 
containerID="cri-o://7f89eaf64e7c7bc9ea3287549d0d73eb381c82bb1cc017882eb6cc10db2b2164" gracePeriod=600 Oct 03 15:17:51 crc kubenswrapper[4774]: I1003 15:17:51.018910 4774 generic.go:334] "Generic (PLEG): container finished" podID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerID="7f89eaf64e7c7bc9ea3287549d0d73eb381c82bb1cc017882eb6cc10db2b2164" exitCode=0 Oct 03 15:17:51 crc kubenswrapper[4774]: I1003 15:17:51.019001 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" event={"ID":"ca37ac4b-f421-4198-a179-12901d36f0f5","Type":"ContainerDied","Data":"7f89eaf64e7c7bc9ea3287549d0d73eb381c82bb1cc017882eb6cc10db2b2164"} Oct 03 15:17:51 crc kubenswrapper[4774]: I1003 15:17:51.019388 4774 scope.go:117] "RemoveContainer" containerID="74d4d4fd29e66b3f46bed47cb8482d6e253bf183444fa990f8669734114e0846" Oct 03 15:17:52 crc kubenswrapper[4774]: I1003 15:17:52.034959 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" event={"ID":"ca37ac4b-f421-4198-a179-12901d36f0f5","Type":"ContainerStarted","Data":"44e7bada9aed42aeacc03e1583cbd14ca821069273bbd22b595b5469bf0384a4"} Oct 03 15:18:24 crc kubenswrapper[4774]: I1003 15:18:24.382409 4774 generic.go:334] "Generic (PLEG): container finished" podID="62582218-c190-4edd-8539-5ca8e8d348e3" containerID="87de06fc3651bead344242b3609142ed8f8c6080ff4d41c735abf29c800ed59c" exitCode=0 Oct 03 15:18:24 crc kubenswrapper[4774]: I1003 15:18:24.382516 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6" event={"ID":"62582218-c190-4edd-8539-5ca8e8d348e3","Type":"ContainerDied","Data":"87de06fc3651bead344242b3609142ed8f8c6080ff4d41c735abf29c800ed59c"} Oct 03 15:18:25 crc kubenswrapper[4774]: I1003 15:18:25.837630 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6" Oct 03 15:18:25 crc kubenswrapper[4774]: I1003 15:18:25.938008 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62582218-c190-4edd-8539-5ca8e8d348e3-ssh-key\") pod \"62582218-c190-4edd-8539-5ca8e8d348e3\" (UID: \"62582218-c190-4edd-8539-5ca8e8d348e3\") " Oct 03 15:18:25 crc kubenswrapper[4774]: I1003 15:18:25.938083 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/62582218-c190-4edd-8539-5ca8e8d348e3-neutron-ovn-metadata-agent-neutron-config-0\") pod \"62582218-c190-4edd-8539-5ca8e8d348e3\" (UID: \"62582218-c190-4edd-8539-5ca8e8d348e3\") " Oct 03 15:18:25 crc kubenswrapper[4774]: I1003 15:18:25.938179 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62582218-c190-4edd-8539-5ca8e8d348e3-neutron-metadata-combined-ca-bundle\") pod \"62582218-c190-4edd-8539-5ca8e8d348e3\" (UID: \"62582218-c190-4edd-8539-5ca8e8d348e3\") " Oct 03 15:18:25 crc kubenswrapper[4774]: I1003 15:18:25.938249 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62582218-c190-4edd-8539-5ca8e8d348e3-inventory\") pod \"62582218-c190-4edd-8539-5ca8e8d348e3\" (UID: \"62582218-c190-4edd-8539-5ca8e8d348e3\") " Oct 03 15:18:25 crc kubenswrapper[4774]: I1003 15:18:25.938303 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsfbq\" (UniqueName: \"kubernetes.io/projected/62582218-c190-4edd-8539-5ca8e8d348e3-kube-api-access-dsfbq\") pod \"62582218-c190-4edd-8539-5ca8e8d348e3\" (UID: \"62582218-c190-4edd-8539-5ca8e8d348e3\") " Oct 03 15:18:25 crc kubenswrapper[4774]: I1003 
15:18:25.938345 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/62582218-c190-4edd-8539-5ca8e8d348e3-nova-metadata-neutron-config-0\") pod \"62582218-c190-4edd-8539-5ca8e8d348e3\" (UID: \"62582218-c190-4edd-8539-5ca8e8d348e3\") " Oct 03 15:18:25 crc kubenswrapper[4774]: I1003 15:18:25.944472 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62582218-c190-4edd-8539-5ca8e8d348e3-kube-api-access-dsfbq" (OuterVolumeSpecName: "kube-api-access-dsfbq") pod "62582218-c190-4edd-8539-5ca8e8d348e3" (UID: "62582218-c190-4edd-8539-5ca8e8d348e3"). InnerVolumeSpecName "kube-api-access-dsfbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:18:25 crc kubenswrapper[4774]: I1003 15:18:25.957538 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62582218-c190-4edd-8539-5ca8e8d348e3-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "62582218-c190-4edd-8539-5ca8e8d348e3" (UID: "62582218-c190-4edd-8539-5ca8e8d348e3"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:18:25 crc kubenswrapper[4774]: I1003 15:18:25.997163 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62582218-c190-4edd-8539-5ca8e8d348e3-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "62582218-c190-4edd-8539-5ca8e8d348e3" (UID: "62582218-c190-4edd-8539-5ca8e8d348e3"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:18:25 crc kubenswrapper[4774]: I1003 15:18:25.997249 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62582218-c190-4edd-8539-5ca8e8d348e3-inventory" (OuterVolumeSpecName: "inventory") pod "62582218-c190-4edd-8539-5ca8e8d348e3" (UID: "62582218-c190-4edd-8539-5ca8e8d348e3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:18:25 crc kubenswrapper[4774]: I1003 15:18:25.997346 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62582218-c190-4edd-8539-5ca8e8d348e3-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "62582218-c190-4edd-8539-5ca8e8d348e3" (UID: "62582218-c190-4edd-8539-5ca8e8d348e3"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.000886 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62582218-c190-4edd-8539-5ca8e8d348e3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "62582218-c190-4edd-8539-5ca8e8d348e3" (UID: "62582218-c190-4edd-8539-5ca8e8d348e3"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.040356 4774 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/62582218-c190-4edd-8539-5ca8e8d348e3-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.040400 4774 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62582218-c190-4edd-8539-5ca8e8d348e3-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.040412 4774 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/62582218-c190-4edd-8539-5ca8e8d348e3-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.040423 4774 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62582218-c190-4edd-8539-5ca8e8d348e3-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.040432 4774 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62582218-c190-4edd-8539-5ca8e8d348e3-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.040441 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsfbq\" (UniqueName: \"kubernetes.io/projected/62582218-c190-4edd-8539-5ca8e8d348e3-kube-api-access-dsfbq\") on node \"crc\" DevicePath \"\"" Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.400419 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6" 
event={"ID":"62582218-c190-4edd-8539-5ca8e8d348e3","Type":"ContainerDied","Data":"63b341da1ace800ba9555a474a1316562fa99a215b109c4caec227564f2a488f"} Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.400464 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6" Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.400467 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63b341da1ace800ba9555a474a1316562fa99a215b109c4caec227564f2a488f" Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.617638 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2"] Oct 03 15:18:26 crc kubenswrapper[4774]: E1003 15:18:26.618105 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62582218-c190-4edd-8539-5ca8e8d348e3" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.618126 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="62582218-c190-4edd-8539-5ca8e8d348e3" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.618345 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="62582218-c190-4edd-8539-5ca8e8d348e3" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.619121 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2" Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.621815 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7bdzq" Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.622836 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.623343 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.624069 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.624075 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.632898 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2"] Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.771709 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41e52a3d-812e-4067-a30e-e9f4ad329411-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2\" (UID: \"41e52a3d-812e-4067-a30e-e9f4ad329411\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2" Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.772350 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/41e52a3d-812e-4067-a30e-e9f4ad329411-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2\" (UID: \"41e52a3d-812e-4067-a30e-e9f4ad329411\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2" Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.772509 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfcwz\" (UniqueName: \"kubernetes.io/projected/41e52a3d-812e-4067-a30e-e9f4ad329411-kube-api-access-dfcwz\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2\" (UID: \"41e52a3d-812e-4067-a30e-e9f4ad329411\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2" Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.772704 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41e52a3d-812e-4067-a30e-e9f4ad329411-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2\" (UID: \"41e52a3d-812e-4067-a30e-e9f4ad329411\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2" Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.772772 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/41e52a3d-812e-4067-a30e-e9f4ad329411-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2\" (UID: \"41e52a3d-812e-4067-a30e-e9f4ad329411\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2" Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.874604 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfcwz\" (UniqueName: \"kubernetes.io/projected/41e52a3d-812e-4067-a30e-e9f4ad329411-kube-api-access-dfcwz\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2\" (UID: \"41e52a3d-812e-4067-a30e-e9f4ad329411\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2" Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.874803 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41e52a3d-812e-4067-a30e-e9f4ad329411-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2\" (UID: \"41e52a3d-812e-4067-a30e-e9f4ad329411\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2" Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.874881 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/41e52a3d-812e-4067-a30e-e9f4ad329411-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2\" (UID: \"41e52a3d-812e-4067-a30e-e9f4ad329411\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2" Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.875039 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41e52a3d-812e-4067-a30e-e9f4ad329411-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2\" (UID: \"41e52a3d-812e-4067-a30e-e9f4ad329411\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2" Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.875161 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/41e52a3d-812e-4067-a30e-e9f4ad329411-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2\" (UID: \"41e52a3d-812e-4067-a30e-e9f4ad329411\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2" Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.879797 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/41e52a3d-812e-4067-a30e-e9f4ad329411-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2\" (UID: \"41e52a3d-812e-4067-a30e-e9f4ad329411\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2" Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.880136 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/41e52a3d-812e-4067-a30e-e9f4ad329411-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2\" (UID: \"41e52a3d-812e-4067-a30e-e9f4ad329411\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2" Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.880275 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41e52a3d-812e-4067-a30e-e9f4ad329411-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2\" (UID: \"41e52a3d-812e-4067-a30e-e9f4ad329411\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2" Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.880623 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41e52a3d-812e-4067-a30e-e9f4ad329411-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2\" (UID: \"41e52a3d-812e-4067-a30e-e9f4ad329411\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2" Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.906956 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfcwz\" (UniqueName: \"kubernetes.io/projected/41e52a3d-812e-4067-a30e-e9f4ad329411-kube-api-access-dfcwz\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2\" (UID: \"41e52a3d-812e-4067-a30e-e9f4ad329411\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2" Oct 03 15:18:26 crc kubenswrapper[4774]: I1003 15:18:26.949489 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2" Oct 03 15:18:27 crc kubenswrapper[4774]: I1003 15:18:27.325056 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2"] Oct 03 15:18:27 crc kubenswrapper[4774]: I1003 15:18:27.412701 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2" event={"ID":"41e52a3d-812e-4067-a30e-e9f4ad329411","Type":"ContainerStarted","Data":"e4cdbd13ffec80e642034da0394c48f06f357c6e0ccf9ddf7249004af9e31491"} Oct 03 15:18:28 crc kubenswrapper[4774]: I1003 15:18:28.422285 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2" event={"ID":"41e52a3d-812e-4067-a30e-e9f4ad329411","Type":"ContainerStarted","Data":"a5a9192cc6a160eb628d2cc566e904f15e6d3aeb97c15125393c73a3aab11e36"} Oct 03 15:19:05 crc kubenswrapper[4774]: I1003 15:19:05.279106 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2" podStartSLOduration=38.861678526 podStartE2EDuration="39.279087144s" podCreationTimestamp="2025-10-03 15:18:26 +0000 UTC" firstStartedPulling="2025-10-03 15:18:27.330816466 +0000 UTC m=+2129.920019918" lastFinishedPulling="2025-10-03 15:18:27.748224964 +0000 UTC m=+2130.337428536" observedRunningTime="2025-10-03 15:18:28.444604944 +0000 UTC m=+2131.033808386" watchObservedRunningTime="2025-10-03 15:19:05.279087144 +0000 UTC m=+2167.868290586" Oct 03 15:19:05 crc kubenswrapper[4774]: I1003 15:19:05.289724 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nx6vr"] Oct 03 15:19:05 crc kubenswrapper[4774]: I1003 15:19:05.292217 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nx6vr" Oct 03 15:19:05 crc kubenswrapper[4774]: I1003 15:19:05.340600 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nx6vr"] Oct 03 15:19:05 crc kubenswrapper[4774]: I1003 15:19:05.465229 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f9ae0e1-faad-4533-a149-ce7983fa9cc1-utilities\") pod \"redhat-marketplace-nx6vr\" (UID: \"5f9ae0e1-faad-4533-a149-ce7983fa9cc1\") " pod="openshift-marketplace/redhat-marketplace-nx6vr" Oct 03 15:19:05 crc kubenswrapper[4774]: I1003 15:19:05.465308 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfnxq\" (UniqueName: \"kubernetes.io/projected/5f9ae0e1-faad-4533-a149-ce7983fa9cc1-kube-api-access-pfnxq\") pod \"redhat-marketplace-nx6vr\" (UID: \"5f9ae0e1-faad-4533-a149-ce7983fa9cc1\") " pod="openshift-marketplace/redhat-marketplace-nx6vr" Oct 03 15:19:05 crc kubenswrapper[4774]: I1003 15:19:05.465363 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f9ae0e1-faad-4533-a149-ce7983fa9cc1-catalog-content\") pod \"redhat-marketplace-nx6vr\" (UID: \"5f9ae0e1-faad-4533-a149-ce7983fa9cc1\") " pod="openshift-marketplace/redhat-marketplace-nx6vr" Oct 03 15:19:05 crc kubenswrapper[4774]: I1003 15:19:05.567827 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f9ae0e1-faad-4533-a149-ce7983fa9cc1-utilities\") pod \"redhat-marketplace-nx6vr\" (UID: \"5f9ae0e1-faad-4533-a149-ce7983fa9cc1\") " pod="openshift-marketplace/redhat-marketplace-nx6vr" Oct 03 15:19:05 crc kubenswrapper[4774]: I1003 15:19:05.567921 4774 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-pfnxq\" (UniqueName: \"kubernetes.io/projected/5f9ae0e1-faad-4533-a149-ce7983fa9cc1-kube-api-access-pfnxq\") pod \"redhat-marketplace-nx6vr\" (UID: \"5f9ae0e1-faad-4533-a149-ce7983fa9cc1\") " pod="openshift-marketplace/redhat-marketplace-nx6vr" Oct 03 15:19:05 crc kubenswrapper[4774]: I1003 15:19:05.567973 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f9ae0e1-faad-4533-a149-ce7983fa9cc1-catalog-content\") pod \"redhat-marketplace-nx6vr\" (UID: \"5f9ae0e1-faad-4533-a149-ce7983fa9cc1\") " pod="openshift-marketplace/redhat-marketplace-nx6vr" Oct 03 15:19:05 crc kubenswrapper[4774]: I1003 15:19:05.568710 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f9ae0e1-faad-4533-a149-ce7983fa9cc1-catalog-content\") pod \"redhat-marketplace-nx6vr\" (UID: \"5f9ae0e1-faad-4533-a149-ce7983fa9cc1\") " pod="openshift-marketplace/redhat-marketplace-nx6vr" Oct 03 15:19:05 crc kubenswrapper[4774]: I1003 15:19:05.568709 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f9ae0e1-faad-4533-a149-ce7983fa9cc1-utilities\") pod \"redhat-marketplace-nx6vr\" (UID: \"5f9ae0e1-faad-4533-a149-ce7983fa9cc1\") " pod="openshift-marketplace/redhat-marketplace-nx6vr" Oct 03 15:19:05 crc kubenswrapper[4774]: I1003 15:19:05.592632 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfnxq\" (UniqueName: \"kubernetes.io/projected/5f9ae0e1-faad-4533-a149-ce7983fa9cc1-kube-api-access-pfnxq\") pod \"redhat-marketplace-nx6vr\" (UID: \"5f9ae0e1-faad-4533-a149-ce7983fa9cc1\") " pod="openshift-marketplace/redhat-marketplace-nx6vr" Oct 03 15:19:05 crc kubenswrapper[4774]: I1003 15:19:05.634945 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nx6vr" Oct 03 15:19:06 crc kubenswrapper[4774]: I1003 15:19:06.152316 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nx6vr"] Oct 03 15:19:06 crc kubenswrapper[4774]: I1003 15:19:06.826042 4774 generic.go:334] "Generic (PLEG): container finished" podID="5f9ae0e1-faad-4533-a149-ce7983fa9cc1" containerID="27c84793a1dfb827e9ee5efa3cd5cd16171bb8b3a38b3773b351b3c19a796de4" exitCode=0 Oct 03 15:19:06 crc kubenswrapper[4774]: I1003 15:19:06.826104 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nx6vr" event={"ID":"5f9ae0e1-faad-4533-a149-ce7983fa9cc1","Type":"ContainerDied","Data":"27c84793a1dfb827e9ee5efa3cd5cd16171bb8b3a38b3773b351b3c19a796de4"} Oct 03 15:19:06 crc kubenswrapper[4774]: I1003 15:19:06.826342 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nx6vr" event={"ID":"5f9ae0e1-faad-4533-a149-ce7983fa9cc1","Type":"ContainerStarted","Data":"a21490d9b75fd01e88820d4aa0c29cb6ad8f1fd21d86abd660f26b0ff60617da"} Oct 03 15:19:08 crc kubenswrapper[4774]: I1003 15:19:08.069984 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-chkqv"] Oct 03 15:19:08 crc kubenswrapper[4774]: I1003 15:19:08.073582 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-chkqv" Oct 03 15:19:08 crc kubenswrapper[4774]: I1003 15:19:08.090488 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-chkqv"] Oct 03 15:19:08 crc kubenswrapper[4774]: I1003 15:19:08.224447 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7357eda-0a29-4105-af4a-eb11dfcd6e37-catalog-content\") pod \"certified-operators-chkqv\" (UID: \"d7357eda-0a29-4105-af4a-eb11dfcd6e37\") " pod="openshift-marketplace/certified-operators-chkqv" Oct 03 15:19:08 crc kubenswrapper[4774]: I1003 15:19:08.224511 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz865\" (UniqueName: \"kubernetes.io/projected/d7357eda-0a29-4105-af4a-eb11dfcd6e37-kube-api-access-sz865\") pod \"certified-operators-chkqv\" (UID: \"d7357eda-0a29-4105-af4a-eb11dfcd6e37\") " pod="openshift-marketplace/certified-operators-chkqv" Oct 03 15:19:08 crc kubenswrapper[4774]: I1003 15:19:08.224565 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7357eda-0a29-4105-af4a-eb11dfcd6e37-utilities\") pod \"certified-operators-chkqv\" (UID: \"d7357eda-0a29-4105-af4a-eb11dfcd6e37\") " pod="openshift-marketplace/certified-operators-chkqv" Oct 03 15:19:08 crc kubenswrapper[4774]: I1003 15:19:08.326165 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7357eda-0a29-4105-af4a-eb11dfcd6e37-catalog-content\") pod \"certified-operators-chkqv\" (UID: \"d7357eda-0a29-4105-af4a-eb11dfcd6e37\") " pod="openshift-marketplace/certified-operators-chkqv" Oct 03 15:19:08 crc kubenswrapper[4774]: I1003 15:19:08.326228 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-sz865\" (UniqueName: \"kubernetes.io/projected/d7357eda-0a29-4105-af4a-eb11dfcd6e37-kube-api-access-sz865\") pod \"certified-operators-chkqv\" (UID: \"d7357eda-0a29-4105-af4a-eb11dfcd6e37\") " pod="openshift-marketplace/certified-operators-chkqv" Oct 03 15:19:08 crc kubenswrapper[4774]: I1003 15:19:08.326300 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7357eda-0a29-4105-af4a-eb11dfcd6e37-utilities\") pod \"certified-operators-chkqv\" (UID: \"d7357eda-0a29-4105-af4a-eb11dfcd6e37\") " pod="openshift-marketplace/certified-operators-chkqv" Oct 03 15:19:08 crc kubenswrapper[4774]: I1003 15:19:08.326954 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7357eda-0a29-4105-af4a-eb11dfcd6e37-catalog-content\") pod \"certified-operators-chkqv\" (UID: \"d7357eda-0a29-4105-af4a-eb11dfcd6e37\") " pod="openshift-marketplace/certified-operators-chkqv" Oct 03 15:19:08 crc kubenswrapper[4774]: I1003 15:19:08.326991 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7357eda-0a29-4105-af4a-eb11dfcd6e37-utilities\") pod \"certified-operators-chkqv\" (UID: \"d7357eda-0a29-4105-af4a-eb11dfcd6e37\") " pod="openshift-marketplace/certified-operators-chkqv" Oct 03 15:19:08 crc kubenswrapper[4774]: I1003 15:19:08.348680 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz865\" (UniqueName: \"kubernetes.io/projected/d7357eda-0a29-4105-af4a-eb11dfcd6e37-kube-api-access-sz865\") pod \"certified-operators-chkqv\" (UID: \"d7357eda-0a29-4105-af4a-eb11dfcd6e37\") " pod="openshift-marketplace/certified-operators-chkqv" Oct 03 15:19:08 crc kubenswrapper[4774]: I1003 15:19:08.417544 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-chkqv" Oct 03 15:19:08 crc kubenswrapper[4774]: I1003 15:19:08.846041 4774 generic.go:334] "Generic (PLEG): container finished" podID="5f9ae0e1-faad-4533-a149-ce7983fa9cc1" containerID="923e4ca31d0760340e866cc70dd3aeb04ddb54a63a70c17f31e76bfa3f83c10a" exitCode=0 Oct 03 15:19:08 crc kubenswrapper[4774]: I1003 15:19:08.846097 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nx6vr" event={"ID":"5f9ae0e1-faad-4533-a149-ce7983fa9cc1","Type":"ContainerDied","Data":"923e4ca31d0760340e866cc70dd3aeb04ddb54a63a70c17f31e76bfa3f83c10a"} Oct 03 15:19:08 crc kubenswrapper[4774]: W1003 15:19:08.908184 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7357eda_0a29_4105_af4a_eb11dfcd6e37.slice/crio-1f918f7a092d3454425b3771a70300e59548143775b47c8bdabdfbcc26265ba9 WatchSource:0}: Error finding container 1f918f7a092d3454425b3771a70300e59548143775b47c8bdabdfbcc26265ba9: Status 404 returned error can't find the container with id 1f918f7a092d3454425b3771a70300e59548143775b47c8bdabdfbcc26265ba9 Oct 03 15:19:08 crc kubenswrapper[4774]: I1003 15:19:08.909404 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-chkqv"] Oct 03 15:19:09 crc kubenswrapper[4774]: I1003 15:19:09.858539 4774 generic.go:334] "Generic (PLEG): container finished" podID="d7357eda-0a29-4105-af4a-eb11dfcd6e37" containerID="3b62376faab0526cc2df2168df4f39a6128346b02e0e8b9cfce70a3fa14a0833" exitCode=0 Oct 03 15:19:09 crc kubenswrapper[4774]: I1003 15:19:09.858682 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-chkqv" event={"ID":"d7357eda-0a29-4105-af4a-eb11dfcd6e37","Type":"ContainerDied","Data":"3b62376faab0526cc2df2168df4f39a6128346b02e0e8b9cfce70a3fa14a0833"} Oct 03 15:19:09 crc kubenswrapper[4774]: I1003 
15:19:09.859031 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-chkqv" event={"ID":"d7357eda-0a29-4105-af4a-eb11dfcd6e37","Type":"ContainerStarted","Data":"1f918f7a092d3454425b3771a70300e59548143775b47c8bdabdfbcc26265ba9"} Oct 03 15:19:09 crc kubenswrapper[4774]: I1003 15:19:09.861345 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nx6vr" event={"ID":"5f9ae0e1-faad-4533-a149-ce7983fa9cc1","Type":"ContainerStarted","Data":"940e57db9e5015dc1d767de117c2a4b38a842b9674a4a1b0e815d586ac885604"} Oct 03 15:19:09 crc kubenswrapper[4774]: I1003 15:19:09.908518 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nx6vr" podStartSLOduration=2.336032746 podStartE2EDuration="4.908491389s" podCreationTimestamp="2025-10-03 15:19:05 +0000 UTC" firstStartedPulling="2025-10-03 15:19:06.828671199 +0000 UTC m=+2169.417874651" lastFinishedPulling="2025-10-03 15:19:09.401129832 +0000 UTC m=+2171.990333294" observedRunningTime="2025-10-03 15:19:09.901712088 +0000 UTC m=+2172.490915620" watchObservedRunningTime="2025-10-03 15:19:09.908491389 +0000 UTC m=+2172.497694881" Oct 03 15:19:11 crc kubenswrapper[4774]: I1003 15:19:11.888255 4774 generic.go:334] "Generic (PLEG): container finished" podID="d7357eda-0a29-4105-af4a-eb11dfcd6e37" containerID="53e07ae76293d0a8c6410747dffe70f014400af1e41241c3e522ea24e7e9f38d" exitCode=0 Oct 03 15:19:11 crc kubenswrapper[4774]: I1003 15:19:11.888409 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-chkqv" event={"ID":"d7357eda-0a29-4105-af4a-eb11dfcd6e37","Type":"ContainerDied","Data":"53e07ae76293d0a8c6410747dffe70f014400af1e41241c3e522ea24e7e9f38d"} Oct 03 15:19:13 crc kubenswrapper[4774]: I1003 15:19:13.913352 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-chkqv" 
event={"ID":"d7357eda-0a29-4105-af4a-eb11dfcd6e37","Type":"ContainerStarted","Data":"c4cf6dae4562f0161005d18c266565fbe16b25bc31314b4902dcebb37ee74dfc"} Oct 03 15:19:13 crc kubenswrapper[4774]: I1003 15:19:13.944095 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-chkqv" podStartSLOduration=2.854843535 podStartE2EDuration="5.944073363s" podCreationTimestamp="2025-10-03 15:19:08 +0000 UTC" firstStartedPulling="2025-10-03 15:19:09.861084851 +0000 UTC m=+2172.450288343" lastFinishedPulling="2025-10-03 15:19:12.950314709 +0000 UTC m=+2175.539518171" observedRunningTime="2025-10-03 15:19:13.934329637 +0000 UTC m=+2176.523533119" watchObservedRunningTime="2025-10-03 15:19:13.944073363 +0000 UTC m=+2176.533276835" Oct 03 15:19:15 crc kubenswrapper[4774]: I1003 15:19:15.635332 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nx6vr" Oct 03 15:19:15 crc kubenswrapper[4774]: I1003 15:19:15.635706 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nx6vr" Oct 03 15:19:15 crc kubenswrapper[4774]: I1003 15:19:15.715064 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nx6vr" Oct 03 15:19:15 crc kubenswrapper[4774]: I1003 15:19:15.987820 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nx6vr" Oct 03 15:19:16 crc kubenswrapper[4774]: I1003 15:19:16.854338 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nx6vr"] Oct 03 15:19:17 crc kubenswrapper[4774]: I1003 15:19:17.958804 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nx6vr" podUID="5f9ae0e1-faad-4533-a149-ce7983fa9cc1" containerName="registry-server" 
containerID="cri-o://940e57db9e5015dc1d767de117c2a4b38a842b9674a4a1b0e815d586ac885604" gracePeriod=2 Oct 03 15:19:18 crc kubenswrapper[4774]: I1003 15:19:18.418535 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-chkqv" Oct 03 15:19:18 crc kubenswrapper[4774]: I1003 15:19:18.418968 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-chkqv" Oct 03 15:19:18 crc kubenswrapper[4774]: I1003 15:19:18.490741 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-chkqv" Oct 03 15:19:18 crc kubenswrapper[4774]: I1003 15:19:18.972893 4774 generic.go:334] "Generic (PLEG): container finished" podID="5f9ae0e1-faad-4533-a149-ce7983fa9cc1" containerID="940e57db9e5015dc1d767de117c2a4b38a842b9674a4a1b0e815d586ac885604" exitCode=0 Oct 03 15:19:18 crc kubenswrapper[4774]: I1003 15:19:18.973524 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nx6vr" event={"ID":"5f9ae0e1-faad-4533-a149-ce7983fa9cc1","Type":"ContainerDied","Data":"940e57db9e5015dc1d767de117c2a4b38a842b9674a4a1b0e815d586ac885604"} Oct 03 15:19:18 crc kubenswrapper[4774]: I1003 15:19:18.973572 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nx6vr" event={"ID":"5f9ae0e1-faad-4533-a149-ce7983fa9cc1","Type":"ContainerDied","Data":"a21490d9b75fd01e88820d4aa0c29cb6ad8f1fd21d86abd660f26b0ff60617da"} Oct 03 15:19:18 crc kubenswrapper[4774]: I1003 15:19:18.973590 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a21490d9b75fd01e88820d4aa0c29cb6ad8f1fd21d86abd660f26b0ff60617da" Oct 03 15:19:19 crc kubenswrapper[4774]: I1003 15:19:19.007449 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nx6vr" Oct 03 15:19:19 crc kubenswrapper[4774]: I1003 15:19:19.091068 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-chkqv" Oct 03 15:19:19 crc kubenswrapper[4774]: I1003 15:19:19.171774 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfnxq\" (UniqueName: \"kubernetes.io/projected/5f9ae0e1-faad-4533-a149-ce7983fa9cc1-kube-api-access-pfnxq\") pod \"5f9ae0e1-faad-4533-a149-ce7983fa9cc1\" (UID: \"5f9ae0e1-faad-4533-a149-ce7983fa9cc1\") " Oct 03 15:19:19 crc kubenswrapper[4774]: I1003 15:19:19.172222 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f9ae0e1-faad-4533-a149-ce7983fa9cc1-utilities\") pod \"5f9ae0e1-faad-4533-a149-ce7983fa9cc1\" (UID: \"5f9ae0e1-faad-4533-a149-ce7983fa9cc1\") " Oct 03 15:19:19 crc kubenswrapper[4774]: I1003 15:19:19.172408 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f9ae0e1-faad-4533-a149-ce7983fa9cc1-catalog-content\") pod \"5f9ae0e1-faad-4533-a149-ce7983fa9cc1\" (UID: \"5f9ae0e1-faad-4533-a149-ce7983fa9cc1\") " Oct 03 15:19:19 crc kubenswrapper[4774]: I1003 15:19:19.173410 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f9ae0e1-faad-4533-a149-ce7983fa9cc1-utilities" (OuterVolumeSpecName: "utilities") pod "5f9ae0e1-faad-4533-a149-ce7983fa9cc1" (UID: "5f9ae0e1-faad-4533-a149-ce7983fa9cc1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:19:19 crc kubenswrapper[4774]: I1003 15:19:19.173704 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f9ae0e1-faad-4533-a149-ce7983fa9cc1-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 15:19:19 crc kubenswrapper[4774]: I1003 15:19:19.183720 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f9ae0e1-faad-4533-a149-ce7983fa9cc1-kube-api-access-pfnxq" (OuterVolumeSpecName: "kube-api-access-pfnxq") pod "5f9ae0e1-faad-4533-a149-ce7983fa9cc1" (UID: "5f9ae0e1-faad-4533-a149-ce7983fa9cc1"). InnerVolumeSpecName "kube-api-access-pfnxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:19:19 crc kubenswrapper[4774]: I1003 15:19:19.203097 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f9ae0e1-faad-4533-a149-ce7983fa9cc1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5f9ae0e1-faad-4533-a149-ce7983fa9cc1" (UID: "5f9ae0e1-faad-4533-a149-ce7983fa9cc1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:19:19 crc kubenswrapper[4774]: I1003 15:19:19.276302 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfnxq\" (UniqueName: \"kubernetes.io/projected/5f9ae0e1-faad-4533-a149-ce7983fa9cc1-kube-api-access-pfnxq\") on node \"crc\" DevicePath \"\"" Oct 03 15:19:19 crc kubenswrapper[4774]: I1003 15:19:19.277024 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f9ae0e1-faad-4533-a149-ce7983fa9cc1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 15:19:19 crc kubenswrapper[4774]: I1003 15:19:19.984859 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nx6vr" Oct 03 15:19:20 crc kubenswrapper[4774]: I1003 15:19:20.021514 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nx6vr"] Oct 03 15:19:20 crc kubenswrapper[4774]: I1003 15:19:20.035980 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nx6vr"] Oct 03 15:19:21 crc kubenswrapper[4774]: I1003 15:19:21.318600 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f9ae0e1-faad-4533-a149-ce7983fa9cc1" path="/var/lib/kubelet/pods/5f9ae0e1-faad-4533-a149-ce7983fa9cc1/volumes" Oct 03 15:19:21 crc kubenswrapper[4774]: I1003 15:19:21.452119 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-chkqv"] Oct 03 15:19:21 crc kubenswrapper[4774]: I1003 15:19:21.452691 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-chkqv" podUID="d7357eda-0a29-4105-af4a-eb11dfcd6e37" containerName="registry-server" containerID="cri-o://c4cf6dae4562f0161005d18c266565fbe16b25bc31314b4902dcebb37ee74dfc" gracePeriod=2 Oct 03 15:19:21 crc kubenswrapper[4774]: I1003 15:19:21.966234 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-chkqv" Oct 03 15:19:22 crc kubenswrapper[4774]: I1003 15:19:22.007180 4774 generic.go:334] "Generic (PLEG): container finished" podID="d7357eda-0a29-4105-af4a-eb11dfcd6e37" containerID="c4cf6dae4562f0161005d18c266565fbe16b25bc31314b4902dcebb37ee74dfc" exitCode=0 Oct 03 15:19:22 crc kubenswrapper[4774]: I1003 15:19:22.007441 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-chkqv" event={"ID":"d7357eda-0a29-4105-af4a-eb11dfcd6e37","Type":"ContainerDied","Data":"c4cf6dae4562f0161005d18c266565fbe16b25bc31314b4902dcebb37ee74dfc"} Oct 03 15:19:22 crc kubenswrapper[4774]: I1003 15:19:22.007525 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-chkqv" event={"ID":"d7357eda-0a29-4105-af4a-eb11dfcd6e37","Type":"ContainerDied","Data":"1f918f7a092d3454425b3771a70300e59548143775b47c8bdabdfbcc26265ba9"} Oct 03 15:19:22 crc kubenswrapper[4774]: I1003 15:19:22.007627 4774 scope.go:117] "RemoveContainer" containerID="c4cf6dae4562f0161005d18c266565fbe16b25bc31314b4902dcebb37ee74dfc" Oct 03 15:19:22 crc kubenswrapper[4774]: I1003 15:19:22.007825 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-chkqv" Oct 03 15:19:22 crc kubenswrapper[4774]: I1003 15:19:22.038800 4774 scope.go:117] "RemoveContainer" containerID="53e07ae76293d0a8c6410747dffe70f014400af1e41241c3e522ea24e7e9f38d" Oct 03 15:19:22 crc kubenswrapper[4774]: I1003 15:19:22.061335 4774 scope.go:117] "RemoveContainer" containerID="3b62376faab0526cc2df2168df4f39a6128346b02e0e8b9cfce70a3fa14a0833" Oct 03 15:19:22 crc kubenswrapper[4774]: I1003 15:19:22.108204 4774 scope.go:117] "RemoveContainer" containerID="c4cf6dae4562f0161005d18c266565fbe16b25bc31314b4902dcebb37ee74dfc" Oct 03 15:19:22 crc kubenswrapper[4774]: E1003 15:19:22.108663 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4cf6dae4562f0161005d18c266565fbe16b25bc31314b4902dcebb37ee74dfc\": container with ID starting with c4cf6dae4562f0161005d18c266565fbe16b25bc31314b4902dcebb37ee74dfc not found: ID does not exist" containerID="c4cf6dae4562f0161005d18c266565fbe16b25bc31314b4902dcebb37ee74dfc" Oct 03 15:19:22 crc kubenswrapper[4774]: I1003 15:19:22.108711 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4cf6dae4562f0161005d18c266565fbe16b25bc31314b4902dcebb37ee74dfc"} err="failed to get container status \"c4cf6dae4562f0161005d18c266565fbe16b25bc31314b4902dcebb37ee74dfc\": rpc error: code = NotFound desc = could not find container \"c4cf6dae4562f0161005d18c266565fbe16b25bc31314b4902dcebb37ee74dfc\": container with ID starting with c4cf6dae4562f0161005d18c266565fbe16b25bc31314b4902dcebb37ee74dfc not found: ID does not exist" Oct 03 15:19:22 crc kubenswrapper[4774]: I1003 15:19:22.108745 4774 scope.go:117] "RemoveContainer" containerID="53e07ae76293d0a8c6410747dffe70f014400af1e41241c3e522ea24e7e9f38d" Oct 03 15:19:22 crc kubenswrapper[4774]: E1003 15:19:22.109179 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"53e07ae76293d0a8c6410747dffe70f014400af1e41241c3e522ea24e7e9f38d\": container with ID starting with 53e07ae76293d0a8c6410747dffe70f014400af1e41241c3e522ea24e7e9f38d not found: ID does not exist" containerID="53e07ae76293d0a8c6410747dffe70f014400af1e41241c3e522ea24e7e9f38d" Oct 03 15:19:22 crc kubenswrapper[4774]: I1003 15:19:22.109250 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53e07ae76293d0a8c6410747dffe70f014400af1e41241c3e522ea24e7e9f38d"} err="failed to get container status \"53e07ae76293d0a8c6410747dffe70f014400af1e41241c3e522ea24e7e9f38d\": rpc error: code = NotFound desc = could not find container \"53e07ae76293d0a8c6410747dffe70f014400af1e41241c3e522ea24e7e9f38d\": container with ID starting with 53e07ae76293d0a8c6410747dffe70f014400af1e41241c3e522ea24e7e9f38d not found: ID does not exist" Oct 03 15:19:22 crc kubenswrapper[4774]: I1003 15:19:22.109294 4774 scope.go:117] "RemoveContainer" containerID="3b62376faab0526cc2df2168df4f39a6128346b02e0e8b9cfce70a3fa14a0833" Oct 03 15:19:22 crc kubenswrapper[4774]: E1003 15:19:22.109763 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b62376faab0526cc2df2168df4f39a6128346b02e0e8b9cfce70a3fa14a0833\": container with ID starting with 3b62376faab0526cc2df2168df4f39a6128346b02e0e8b9cfce70a3fa14a0833 not found: ID does not exist" containerID="3b62376faab0526cc2df2168df4f39a6128346b02e0e8b9cfce70a3fa14a0833" Oct 03 15:19:22 crc kubenswrapper[4774]: I1003 15:19:22.109791 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b62376faab0526cc2df2168df4f39a6128346b02e0e8b9cfce70a3fa14a0833"} err="failed to get container status \"3b62376faab0526cc2df2168df4f39a6128346b02e0e8b9cfce70a3fa14a0833\": rpc error: code = NotFound desc = could not find container 
\"3b62376faab0526cc2df2168df4f39a6128346b02e0e8b9cfce70a3fa14a0833\": container with ID starting with 3b62376faab0526cc2df2168df4f39a6128346b02e0e8b9cfce70a3fa14a0833 not found: ID does not exist" Oct 03 15:19:22 crc kubenswrapper[4774]: I1003 15:19:22.140603 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sz865\" (UniqueName: \"kubernetes.io/projected/d7357eda-0a29-4105-af4a-eb11dfcd6e37-kube-api-access-sz865\") pod \"d7357eda-0a29-4105-af4a-eb11dfcd6e37\" (UID: \"d7357eda-0a29-4105-af4a-eb11dfcd6e37\") " Oct 03 15:19:22 crc kubenswrapper[4774]: I1003 15:19:22.141092 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7357eda-0a29-4105-af4a-eb11dfcd6e37-utilities\") pod \"d7357eda-0a29-4105-af4a-eb11dfcd6e37\" (UID: \"d7357eda-0a29-4105-af4a-eb11dfcd6e37\") " Oct 03 15:19:22 crc kubenswrapper[4774]: I1003 15:19:22.141208 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7357eda-0a29-4105-af4a-eb11dfcd6e37-catalog-content\") pod \"d7357eda-0a29-4105-af4a-eb11dfcd6e37\" (UID: \"d7357eda-0a29-4105-af4a-eb11dfcd6e37\") " Oct 03 15:19:22 crc kubenswrapper[4774]: I1003 15:19:22.141801 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7357eda-0a29-4105-af4a-eb11dfcd6e37-utilities" (OuterVolumeSpecName: "utilities") pod "d7357eda-0a29-4105-af4a-eb11dfcd6e37" (UID: "d7357eda-0a29-4105-af4a-eb11dfcd6e37"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:19:22 crc kubenswrapper[4774]: I1003 15:19:22.146422 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7357eda-0a29-4105-af4a-eb11dfcd6e37-kube-api-access-sz865" (OuterVolumeSpecName: "kube-api-access-sz865") pod "d7357eda-0a29-4105-af4a-eb11dfcd6e37" (UID: "d7357eda-0a29-4105-af4a-eb11dfcd6e37"). InnerVolumeSpecName "kube-api-access-sz865". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:19:22 crc kubenswrapper[4774]: I1003 15:19:22.185665 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7357eda-0a29-4105-af4a-eb11dfcd6e37-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d7357eda-0a29-4105-af4a-eb11dfcd6e37" (UID: "d7357eda-0a29-4105-af4a-eb11dfcd6e37"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:19:22 crc kubenswrapper[4774]: I1003 15:19:22.243943 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7357eda-0a29-4105-af4a-eb11dfcd6e37-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 15:19:22 crc kubenswrapper[4774]: I1003 15:19:22.244005 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sz865\" (UniqueName: \"kubernetes.io/projected/d7357eda-0a29-4105-af4a-eb11dfcd6e37-kube-api-access-sz865\") on node \"crc\" DevicePath \"\"" Oct 03 15:19:22 crc kubenswrapper[4774]: I1003 15:19:22.244026 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7357eda-0a29-4105-af4a-eb11dfcd6e37-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 15:19:22 crc kubenswrapper[4774]: I1003 15:19:22.362669 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-chkqv"] Oct 03 15:19:22 crc kubenswrapper[4774]: I1003 
15:19:22.372971 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-chkqv"] Oct 03 15:19:23 crc kubenswrapper[4774]: I1003 15:19:23.322029 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7357eda-0a29-4105-af4a-eb11dfcd6e37" path="/var/lib/kubelet/pods/d7357eda-0a29-4105-af4a-eb11dfcd6e37/volumes" Oct 03 15:20:20 crc kubenswrapper[4774]: I1003 15:20:20.654401 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:20:20 crc kubenswrapper[4774]: I1003 15:20:20.655133 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:20:50 crc kubenswrapper[4774]: I1003 15:20:50.654080 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:20:50 crc kubenswrapper[4774]: I1003 15:20:50.654755 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:21:20 crc kubenswrapper[4774]: I1003 15:21:20.654068 4774 patch_prober.go:28] interesting 
pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:21:20 crc kubenswrapper[4774]: I1003 15:21:20.654735 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:21:20 crc kubenswrapper[4774]: I1003 15:21:20.654787 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" Oct 03 15:21:20 crc kubenswrapper[4774]: I1003 15:21:20.655581 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"44e7bada9aed42aeacc03e1583cbd14ca821069273bbd22b595b5469bf0384a4"} pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 15:21:20 crc kubenswrapper[4774]: I1003 15:21:20.655650 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" containerID="cri-o://44e7bada9aed42aeacc03e1583cbd14ca821069273bbd22b595b5469bf0384a4" gracePeriod=600 Oct 03 15:21:20 crc kubenswrapper[4774]: E1003 15:21:20.847520 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:21:21 crc kubenswrapper[4774]: I1003 15:21:21.311808 4774 generic.go:334] "Generic (PLEG): container finished" podID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerID="44e7bada9aed42aeacc03e1583cbd14ca821069273bbd22b595b5469bf0384a4" exitCode=0 Oct 03 15:21:21 crc kubenswrapper[4774]: I1003 15:21:21.313341 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" event={"ID":"ca37ac4b-f421-4198-a179-12901d36f0f5","Type":"ContainerDied","Data":"44e7bada9aed42aeacc03e1583cbd14ca821069273bbd22b595b5469bf0384a4"} Oct 03 15:21:21 crc kubenswrapper[4774]: I1003 15:21:21.313436 4774 scope.go:117] "RemoveContainer" containerID="7f89eaf64e7c7bc9ea3287549d0d73eb381c82bb1cc017882eb6cc10db2b2164" Oct 03 15:21:21 crc kubenswrapper[4774]: I1003 15:21:21.314050 4774 scope.go:117] "RemoveContainer" containerID="44e7bada9aed42aeacc03e1583cbd14ca821069273bbd22b595b5469bf0384a4" Oct 03 15:21:21 crc kubenswrapper[4774]: E1003 15:21:21.314332 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:21:35 crc kubenswrapper[4774]: I1003 15:21:35.313595 4774 scope.go:117] "RemoveContainer" containerID="44e7bada9aed42aeacc03e1583cbd14ca821069273bbd22b595b5469bf0384a4" Oct 03 15:21:35 crc kubenswrapper[4774]: E1003 15:21:35.314481 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:21:50 crc kubenswrapper[4774]: I1003 15:21:50.300817 4774 scope.go:117] "RemoveContainer" containerID="44e7bada9aed42aeacc03e1583cbd14ca821069273bbd22b595b5469bf0384a4" Oct 03 15:21:50 crc kubenswrapper[4774]: E1003 15:21:50.302097 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:22:02 crc kubenswrapper[4774]: I1003 15:22:02.300301 4774 scope.go:117] "RemoveContainer" containerID="44e7bada9aed42aeacc03e1583cbd14ca821069273bbd22b595b5469bf0384a4" Oct 03 15:22:02 crc kubenswrapper[4774]: E1003 15:22:02.301698 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:22:14 crc kubenswrapper[4774]: I1003 15:22:14.300145 4774 scope.go:117] "RemoveContainer" containerID="44e7bada9aed42aeacc03e1583cbd14ca821069273bbd22b595b5469bf0384a4" Oct 03 15:22:14 crc kubenswrapper[4774]: E1003 15:22:14.302440 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:22:25 crc kubenswrapper[4774]: I1003 15:22:25.300060 4774 scope.go:117] "RemoveContainer" containerID="44e7bada9aed42aeacc03e1583cbd14ca821069273bbd22b595b5469bf0384a4" Oct 03 15:22:25 crc kubenswrapper[4774]: E1003 15:22:25.301043 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:22:34 crc kubenswrapper[4774]: I1003 15:22:34.098220 4774 generic.go:334] "Generic (PLEG): container finished" podID="41e52a3d-812e-4067-a30e-e9f4ad329411" containerID="a5a9192cc6a160eb628d2cc566e904f15e6d3aeb97c15125393c73a3aab11e36" exitCode=0 Oct 03 15:22:34 crc kubenswrapper[4774]: I1003 15:22:34.098318 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2" event={"ID":"41e52a3d-812e-4067-a30e-e9f4ad329411","Type":"ContainerDied","Data":"a5a9192cc6a160eb628d2cc566e904f15e6d3aeb97c15125393c73a3aab11e36"} Oct 03 15:22:35 crc kubenswrapper[4774]: I1003 15:22:35.515529 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2" Oct 03 15:22:35 crc kubenswrapper[4774]: I1003 15:22:35.684579 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41e52a3d-812e-4067-a30e-e9f4ad329411-libvirt-combined-ca-bundle\") pod \"41e52a3d-812e-4067-a30e-e9f4ad329411\" (UID: \"41e52a3d-812e-4067-a30e-e9f4ad329411\") " Oct 03 15:22:35 crc kubenswrapper[4774]: I1003 15:22:35.684647 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/41e52a3d-812e-4067-a30e-e9f4ad329411-ssh-key\") pod \"41e52a3d-812e-4067-a30e-e9f4ad329411\" (UID: \"41e52a3d-812e-4067-a30e-e9f4ad329411\") " Oct 03 15:22:35 crc kubenswrapper[4774]: I1003 15:22:35.684770 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfcwz\" (UniqueName: \"kubernetes.io/projected/41e52a3d-812e-4067-a30e-e9f4ad329411-kube-api-access-dfcwz\") pod \"41e52a3d-812e-4067-a30e-e9f4ad329411\" (UID: \"41e52a3d-812e-4067-a30e-e9f4ad329411\") " Oct 03 15:22:35 crc kubenswrapper[4774]: I1003 15:22:35.684833 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/41e52a3d-812e-4067-a30e-e9f4ad329411-libvirt-secret-0\") pod \"41e52a3d-812e-4067-a30e-e9f4ad329411\" (UID: \"41e52a3d-812e-4067-a30e-e9f4ad329411\") " Oct 03 15:22:35 crc kubenswrapper[4774]: I1003 15:22:35.684928 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41e52a3d-812e-4067-a30e-e9f4ad329411-inventory\") pod \"41e52a3d-812e-4067-a30e-e9f4ad329411\" (UID: \"41e52a3d-812e-4067-a30e-e9f4ad329411\") " Oct 03 15:22:35 crc kubenswrapper[4774]: I1003 15:22:35.690668 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/41e52a3d-812e-4067-a30e-e9f4ad329411-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "41e52a3d-812e-4067-a30e-e9f4ad329411" (UID: "41e52a3d-812e-4067-a30e-e9f4ad329411"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:22:35 crc kubenswrapper[4774]: I1003 15:22:35.690729 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41e52a3d-812e-4067-a30e-e9f4ad329411-kube-api-access-dfcwz" (OuterVolumeSpecName: "kube-api-access-dfcwz") pod "41e52a3d-812e-4067-a30e-e9f4ad329411" (UID: "41e52a3d-812e-4067-a30e-e9f4ad329411"). InnerVolumeSpecName "kube-api-access-dfcwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:22:35 crc kubenswrapper[4774]: I1003 15:22:35.715644 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41e52a3d-812e-4067-a30e-e9f4ad329411-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "41e52a3d-812e-4067-a30e-e9f4ad329411" (UID: "41e52a3d-812e-4067-a30e-e9f4ad329411"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:22:35 crc kubenswrapper[4774]: I1003 15:22:35.720416 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41e52a3d-812e-4067-a30e-e9f4ad329411-inventory" (OuterVolumeSpecName: "inventory") pod "41e52a3d-812e-4067-a30e-e9f4ad329411" (UID: "41e52a3d-812e-4067-a30e-e9f4ad329411"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:22:35 crc kubenswrapper[4774]: I1003 15:22:35.740000 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41e52a3d-812e-4067-a30e-e9f4ad329411-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "41e52a3d-812e-4067-a30e-e9f4ad329411" (UID: "41e52a3d-812e-4067-a30e-e9f4ad329411"). 
InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:22:35 crc kubenswrapper[4774]: I1003 15:22:35.787478 4774 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41e52a3d-812e-4067-a30e-e9f4ad329411-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:22:35 crc kubenswrapper[4774]: I1003 15:22:35.787526 4774 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/41e52a3d-812e-4067-a30e-e9f4ad329411-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 15:22:35 crc kubenswrapper[4774]: I1003 15:22:35.787547 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfcwz\" (UniqueName: \"kubernetes.io/projected/41e52a3d-812e-4067-a30e-e9f4ad329411-kube-api-access-dfcwz\") on node \"crc\" DevicePath \"\"" Oct 03 15:22:35 crc kubenswrapper[4774]: I1003 15:22:35.787566 4774 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/41e52a3d-812e-4067-a30e-e9f4ad329411-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 03 15:22:35 crc kubenswrapper[4774]: I1003 15:22:35.787584 4774 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41e52a3d-812e-4067-a30e-e9f4ad329411-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.114908 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2" event={"ID":"41e52a3d-812e-4067-a30e-e9f4ad329411","Type":"ContainerDied","Data":"e4cdbd13ffec80e642034da0394c48f06f357c6e0ccf9ddf7249004af9e31491"} Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.114949 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4cdbd13ffec80e642034da0394c48f06f357c6e0ccf9ddf7249004af9e31491" Oct 03 15:22:36 
crc kubenswrapper[4774]: I1003 15:22:36.114999 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.206980 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt"] Oct 03 15:22:36 crc kubenswrapper[4774]: E1003 15:22:36.207362 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7357eda-0a29-4105-af4a-eb11dfcd6e37" containerName="extract-utilities" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.209805 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7357eda-0a29-4105-af4a-eb11dfcd6e37" containerName="extract-utilities" Oct 03 15:22:36 crc kubenswrapper[4774]: E1003 15:22:36.209848 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41e52a3d-812e-4067-a30e-e9f4ad329411" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.209860 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="41e52a3d-812e-4067-a30e-e9f4ad329411" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 03 15:22:36 crc kubenswrapper[4774]: E1003 15:22:36.209887 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7357eda-0a29-4105-af4a-eb11dfcd6e37" containerName="extract-content" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.209895 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7357eda-0a29-4105-af4a-eb11dfcd6e37" containerName="extract-content" Oct 03 15:22:36 crc kubenswrapper[4774]: E1003 15:22:36.209917 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7357eda-0a29-4105-af4a-eb11dfcd6e37" containerName="registry-server" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.209925 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7357eda-0a29-4105-af4a-eb11dfcd6e37" 
containerName="registry-server" Oct 03 15:22:36 crc kubenswrapper[4774]: E1003 15:22:36.209936 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f9ae0e1-faad-4533-a149-ce7983fa9cc1" containerName="registry-server" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.209944 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f9ae0e1-faad-4533-a149-ce7983fa9cc1" containerName="registry-server" Oct 03 15:22:36 crc kubenswrapper[4774]: E1003 15:22:36.209955 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f9ae0e1-faad-4533-a149-ce7983fa9cc1" containerName="extract-content" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.209962 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f9ae0e1-faad-4533-a149-ce7983fa9cc1" containerName="extract-content" Oct 03 15:22:36 crc kubenswrapper[4774]: E1003 15:22:36.209977 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f9ae0e1-faad-4533-a149-ce7983fa9cc1" containerName="extract-utilities" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.209987 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f9ae0e1-faad-4533-a149-ce7983fa9cc1" containerName="extract-utilities" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.210274 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f9ae0e1-faad-4533-a149-ce7983fa9cc1" containerName="registry-server" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.210302 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7357eda-0a29-4105-af4a-eb11dfcd6e37" containerName="registry-server" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.210322 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="41e52a3d-812e-4067-a30e-e9f4ad329411" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.211107 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.213441 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7bdzq" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.214443 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.214444 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.214506 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.214679 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.214717 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.215320 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.224665 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt"] Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.400708 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8vrlt\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 
15:22:36.401054 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8vrlt\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.401168 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8vrlt\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.401240 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8vrlt\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.401394 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8vrlt\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.401468 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8vrlt\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.401623 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdjxx\" (UniqueName: \"kubernetes.io/projected/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-kube-api-access-zdjxx\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8vrlt\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.401697 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8vrlt\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.401815 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8vrlt\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.503108 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdjxx\" (UniqueName: \"kubernetes.io/projected/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-kube-api-access-zdjxx\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8vrlt\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.503401 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8vrlt\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.503451 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8vrlt\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.504199 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8vrlt\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.504403 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8vrlt\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.504472 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8vrlt\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.504505 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8vrlt\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.504663 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8vrlt\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.504704 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8vrlt\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.505588 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8vrlt\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" Oct 03 15:22:36 crc 
kubenswrapper[4774]: I1003 15:22:36.507361 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8vrlt\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.507921 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8vrlt\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.509007 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8vrlt\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.509308 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8vrlt\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.509825 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-migration-ssh-key-1\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-8vrlt\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.511885 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8vrlt\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.514876 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8vrlt\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.519970 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdjxx\" (UniqueName: \"kubernetes.io/projected/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-kube-api-access-zdjxx\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8vrlt\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" Oct 03 15:22:36 crc kubenswrapper[4774]: I1003 15:22:36.530686 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" Oct 03 15:22:37 crc kubenswrapper[4774]: I1003 15:22:37.121678 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt"] Oct 03 15:22:37 crc kubenswrapper[4774]: W1003 15:22:37.123474 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3033c9b_77df_46fe_b9f6_34fedecfbdc8.slice/crio-f3aeb01c325513b897368c6e4d54d2432f2a6e79094c7d0c7fbc0b1aa4f529f3 WatchSource:0}: Error finding container f3aeb01c325513b897368c6e4d54d2432f2a6e79094c7d0c7fbc0b1aa4f529f3: Status 404 returned error can't find the container with id f3aeb01c325513b897368c6e4d54d2432f2a6e79094c7d0c7fbc0b1aa4f529f3 Oct 03 15:22:37 crc kubenswrapper[4774]: I1003 15:22:37.128883 4774 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 15:22:37 crc kubenswrapper[4774]: I1003 15:22:37.299009 4774 scope.go:117] "RemoveContainer" containerID="44e7bada9aed42aeacc03e1583cbd14ca821069273bbd22b595b5469bf0384a4" Oct 03 15:22:37 crc kubenswrapper[4774]: E1003 15:22:37.299241 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:22:38 crc kubenswrapper[4774]: I1003 15:22:38.135226 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" event={"ID":"a3033c9b-77df-46fe-b9f6-34fedecfbdc8","Type":"ContainerStarted","Data":"9ed954993d6987423c06999f220d7c173dcad05d1f61f4ffbde37cb07cd154ae"} Oct 03 15:22:38 crc 
kubenswrapper[4774]: I1003 15:22:38.135640 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" event={"ID":"a3033c9b-77df-46fe-b9f6-34fedecfbdc8","Type":"ContainerStarted","Data":"f3aeb01c325513b897368c6e4d54d2432f2a6e79094c7d0c7fbc0b1aa4f529f3"} Oct 03 15:22:38 crc kubenswrapper[4774]: I1003 15:22:38.159866 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" podStartSLOduration=1.553671177 podStartE2EDuration="2.159847844s" podCreationTimestamp="2025-10-03 15:22:36 +0000 UTC" firstStartedPulling="2025-10-03 15:22:37.12859524 +0000 UTC m=+2379.717798702" lastFinishedPulling="2025-10-03 15:22:37.734771917 +0000 UTC m=+2380.323975369" observedRunningTime="2025-10-03 15:22:38.151517737 +0000 UTC m=+2380.740721189" watchObservedRunningTime="2025-10-03 15:22:38.159847844 +0000 UTC m=+2380.749051296" Oct 03 15:22:51 crc kubenswrapper[4774]: I1003 15:22:51.330616 4774 scope.go:117] "RemoveContainer" containerID="44e7bada9aed42aeacc03e1583cbd14ca821069273bbd22b595b5469bf0384a4" Oct 03 15:22:51 crc kubenswrapper[4774]: E1003 15:22:51.331351 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:23:05 crc kubenswrapper[4774]: I1003 15:23:05.299777 4774 scope.go:117] "RemoveContainer" containerID="44e7bada9aed42aeacc03e1583cbd14ca821069273bbd22b595b5469bf0384a4" Oct 03 15:23:05 crc kubenswrapper[4774]: E1003 15:23:05.300556 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:23:18 crc kubenswrapper[4774]: I1003 15:23:18.299885 4774 scope.go:117] "RemoveContainer" containerID="44e7bada9aed42aeacc03e1583cbd14ca821069273bbd22b595b5469bf0384a4" Oct 03 15:23:18 crc kubenswrapper[4774]: E1003 15:23:18.300655 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:23:31 crc kubenswrapper[4774]: I1003 15:23:31.299294 4774 scope.go:117] "RemoveContainer" containerID="44e7bada9aed42aeacc03e1583cbd14ca821069273bbd22b595b5469bf0384a4" Oct 03 15:23:31 crc kubenswrapper[4774]: E1003 15:23:31.300224 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:23:45 crc kubenswrapper[4774]: I1003 15:23:45.299432 4774 scope.go:117] "RemoveContainer" containerID="44e7bada9aed42aeacc03e1583cbd14ca821069273bbd22b595b5469bf0384a4" Oct 03 15:23:45 crc kubenswrapper[4774]: E1003 15:23:45.300403 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:23:57 crc kubenswrapper[4774]: I1003 15:23:57.300182 4774 scope.go:117] "RemoveContainer" containerID="44e7bada9aed42aeacc03e1583cbd14ca821069273bbd22b595b5469bf0384a4" Oct 03 15:23:57 crc kubenswrapper[4774]: E1003 15:23:57.301669 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:24:10 crc kubenswrapper[4774]: I1003 15:24:10.300082 4774 scope.go:117] "RemoveContainer" containerID="44e7bada9aed42aeacc03e1583cbd14ca821069273bbd22b595b5469bf0384a4" Oct 03 15:24:10 crc kubenswrapper[4774]: E1003 15:24:10.301546 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:24:21 crc kubenswrapper[4774]: I1003 15:24:21.300748 4774 scope.go:117] "RemoveContainer" containerID="44e7bada9aed42aeacc03e1583cbd14ca821069273bbd22b595b5469bf0384a4" Oct 03 15:24:21 crc kubenswrapper[4774]: E1003 15:24:21.301830 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:24:35 crc kubenswrapper[4774]: I1003 15:24:35.300729 4774 scope.go:117] "RemoveContainer" containerID="44e7bada9aed42aeacc03e1583cbd14ca821069273bbd22b595b5469bf0384a4" Oct 03 15:24:35 crc kubenswrapper[4774]: E1003 15:24:35.301573 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:24:47 crc kubenswrapper[4774]: I1003 15:24:47.301113 4774 scope.go:117] "RemoveContainer" containerID="44e7bada9aed42aeacc03e1583cbd14ca821069273bbd22b595b5469bf0384a4" Oct 03 15:24:47 crc kubenswrapper[4774]: E1003 15:24:47.302509 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:25:02 crc kubenswrapper[4774]: I1003 15:25:02.299869 4774 scope.go:117] "RemoveContainer" containerID="44e7bada9aed42aeacc03e1583cbd14ca821069273bbd22b595b5469bf0384a4" Oct 03 15:25:02 crc kubenswrapper[4774]: E1003 15:25:02.302722 4774 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:25:16 crc kubenswrapper[4774]: I1003 15:25:16.299265 4774 scope.go:117] "RemoveContainer" containerID="44e7bada9aed42aeacc03e1583cbd14ca821069273bbd22b595b5469bf0384a4" Oct 03 15:25:16 crc kubenswrapper[4774]: E1003 15:25:16.300035 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:25:26 crc kubenswrapper[4774]: I1003 15:25:26.560321 4774 scope.go:117] "RemoveContainer" containerID="940e57db9e5015dc1d767de117c2a4b38a842b9674a4a1b0e815d586ac885604" Oct 03 15:25:26 crc kubenswrapper[4774]: I1003 15:25:26.586439 4774 scope.go:117] "RemoveContainer" containerID="923e4ca31d0760340e866cc70dd3aeb04ddb54a63a70c17f31e76bfa3f83c10a" Oct 03 15:25:26 crc kubenswrapper[4774]: I1003 15:25:26.624312 4774 scope.go:117] "RemoveContainer" containerID="27c84793a1dfb827e9ee5efa3cd5cd16171bb8b3a38b3773b351b3c19a796de4" Oct 03 15:25:27 crc kubenswrapper[4774]: I1003 15:25:27.299732 4774 scope.go:117] "RemoveContainer" containerID="44e7bada9aed42aeacc03e1583cbd14ca821069273bbd22b595b5469bf0384a4" Oct 03 15:25:27 crc kubenswrapper[4774]: E1003 15:25:27.300144 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:25:40 crc kubenswrapper[4774]: I1003 15:25:40.299833 4774 scope.go:117] "RemoveContainer" containerID="44e7bada9aed42aeacc03e1583cbd14ca821069273bbd22b595b5469bf0384a4" Oct 03 15:25:40 crc kubenswrapper[4774]: E1003 15:25:40.300679 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:25:47 crc kubenswrapper[4774]: I1003 15:25:47.103957 4774 generic.go:334] "Generic (PLEG): container finished" podID="a3033c9b-77df-46fe-b9f6-34fedecfbdc8" containerID="9ed954993d6987423c06999f220d7c173dcad05d1f61f4ffbde37cb07cd154ae" exitCode=0 Oct 03 15:25:47 crc kubenswrapper[4774]: I1003 15:25:47.104060 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" event={"ID":"a3033c9b-77df-46fe-b9f6-34fedecfbdc8","Type":"ContainerDied","Data":"9ed954993d6987423c06999f220d7c173dcad05d1f61f4ffbde37cb07cd154ae"} Oct 03 15:25:48 crc kubenswrapper[4774]: I1003 15:25:48.595558 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" Oct 03 15:25:48 crc kubenswrapper[4774]: I1003 15:25:48.693508 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-migration-ssh-key-0\") pod \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " Oct 03 15:25:48 crc kubenswrapper[4774]: I1003 15:25:48.693590 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-inventory\") pod \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " Oct 03 15:25:48 crc kubenswrapper[4774]: I1003 15:25:48.693679 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-ssh-key\") pod \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " Oct 03 15:25:48 crc kubenswrapper[4774]: I1003 15:25:48.693733 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-cell1-compute-config-0\") pod \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " Oct 03 15:25:48 crc kubenswrapper[4774]: I1003 15:25:48.693757 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdjxx\" (UniqueName: \"kubernetes.io/projected/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-kube-api-access-zdjxx\") pod \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " Oct 03 15:25:48 crc kubenswrapper[4774]: I1003 15:25:48.693836 4774 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-migration-ssh-key-1\") pod \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " Oct 03 15:25:48 crc kubenswrapper[4774]: I1003 15:25:48.693879 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-combined-ca-bundle\") pod \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " Oct 03 15:25:48 crc kubenswrapper[4774]: I1003 15:25:48.693980 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-extra-config-0\") pod \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " Oct 03 15:25:48 crc kubenswrapper[4774]: I1003 15:25:48.694007 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-cell1-compute-config-1\") pod \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\" (UID: \"a3033c9b-77df-46fe-b9f6-34fedecfbdc8\") " Oct 03 15:25:48 crc kubenswrapper[4774]: I1003 15:25:48.700351 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "a3033c9b-77df-46fe-b9f6-34fedecfbdc8" (UID: "a3033c9b-77df-46fe-b9f6-34fedecfbdc8"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:25:48 crc kubenswrapper[4774]: I1003 15:25:48.700372 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-kube-api-access-zdjxx" (OuterVolumeSpecName: "kube-api-access-zdjxx") pod "a3033c9b-77df-46fe-b9f6-34fedecfbdc8" (UID: "a3033c9b-77df-46fe-b9f6-34fedecfbdc8"). InnerVolumeSpecName "kube-api-access-zdjxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:25:48 crc kubenswrapper[4774]: I1003 15:25:48.724131 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "a3033c9b-77df-46fe-b9f6-34fedecfbdc8" (UID: "a3033c9b-77df-46fe-b9f6-34fedecfbdc8"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:25:48 crc kubenswrapper[4774]: I1003 15:25:48.726377 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "a3033c9b-77df-46fe-b9f6-34fedecfbdc8" (UID: "a3033c9b-77df-46fe-b9f6-34fedecfbdc8"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:25:48 crc kubenswrapper[4774]: I1003 15:25:48.730471 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a3033c9b-77df-46fe-b9f6-34fedecfbdc8" (UID: "a3033c9b-77df-46fe-b9f6-34fedecfbdc8"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:25:48 crc kubenswrapper[4774]: I1003 15:25:48.731210 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "a3033c9b-77df-46fe-b9f6-34fedecfbdc8" (UID: "a3033c9b-77df-46fe-b9f6-34fedecfbdc8"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:25:48 crc kubenswrapper[4774]: I1003 15:25:48.731553 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-inventory" (OuterVolumeSpecName: "inventory") pod "a3033c9b-77df-46fe-b9f6-34fedecfbdc8" (UID: "a3033c9b-77df-46fe-b9f6-34fedecfbdc8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:25:48 crc kubenswrapper[4774]: I1003 15:25:48.735552 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "a3033c9b-77df-46fe-b9f6-34fedecfbdc8" (UID: "a3033c9b-77df-46fe-b9f6-34fedecfbdc8"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:25:48 crc kubenswrapper[4774]: I1003 15:25:48.741946 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "a3033c9b-77df-46fe-b9f6-34fedecfbdc8" (UID: "a3033c9b-77df-46fe-b9f6-34fedecfbdc8"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:25:48 crc kubenswrapper[4774]: I1003 15:25:48.796596 4774 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:25:48 crc kubenswrapper[4774]: I1003 15:25:48.796722 4774 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 03 15:25:48 crc kubenswrapper[4774]: I1003 15:25:48.796783 4774 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 03 15:25:48 crc kubenswrapper[4774]: I1003 15:25:48.796841 4774 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 03 15:25:48 crc kubenswrapper[4774]: I1003 15:25:48.796894 4774 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 15:25:48 crc kubenswrapper[4774]: I1003 15:25:48.796949 4774 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 15:25:48 crc kubenswrapper[4774]: I1003 15:25:48.797004 4774 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 03 
15:25:48 crc kubenswrapper[4774]: I1003 15:25:48.797055 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdjxx\" (UniqueName: \"kubernetes.io/projected/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-kube-api-access-zdjxx\") on node \"crc\" DevicePath \"\"" Oct 03 15:25:48 crc kubenswrapper[4774]: I1003 15:25:48.797112 4774 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a3033c9b-77df-46fe-b9f6-34fedecfbdc8-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 03 15:25:49 crc kubenswrapper[4774]: I1003 15:25:49.123472 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" event={"ID":"a3033c9b-77df-46fe-b9f6-34fedecfbdc8","Type":"ContainerDied","Data":"f3aeb01c325513b897368c6e4d54d2432f2a6e79094c7d0c7fbc0b1aa4f529f3"} Oct 03 15:25:49 crc kubenswrapper[4774]: I1003 15:25:49.123515 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3aeb01c325513b897368c6e4d54d2432f2a6e79094c7d0c7fbc0b1aa4f529f3" Oct 03 15:25:49 crc kubenswrapper[4774]: I1003 15:25:49.123537 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8vrlt" Oct 03 15:25:49 crc kubenswrapper[4774]: I1003 15:25:49.217843 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7"] Oct 03 15:25:49 crc kubenswrapper[4774]: E1003 15:25:49.218551 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3033c9b-77df-46fe-b9f6-34fedecfbdc8" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 03 15:25:49 crc kubenswrapper[4774]: I1003 15:25:49.218573 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3033c9b-77df-46fe-b9f6-34fedecfbdc8" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 03 15:25:49 crc kubenswrapper[4774]: I1003 15:25:49.218969 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3033c9b-77df-46fe-b9f6-34fedecfbdc8" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 03 15:25:49 crc kubenswrapper[4774]: I1003 15:25:49.219962 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7" Oct 03 15:25:49 crc kubenswrapper[4774]: I1003 15:25:49.223484 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 15:25:49 crc kubenswrapper[4774]: I1003 15:25:49.223880 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 03 15:25:49 crc kubenswrapper[4774]: I1003 15:25:49.224058 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 15:25:49 crc kubenswrapper[4774]: I1003 15:25:49.224128 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7bdzq" Oct 03 15:25:49 crc kubenswrapper[4774]: I1003 15:25:49.225181 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 15:25:49 crc kubenswrapper[4774]: I1003 15:25:49.229274 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7"] Oct 03 15:25:49 crc kubenswrapper[4774]: I1003 15:25:49.408785 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cjrf\" (UniqueName: \"kubernetes.io/projected/433217d2-80d5-452b-9980-c1aaac39b5c1-kube-api-access-4cjrf\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7\" (UID: \"433217d2-80d5-452b-9980-c1aaac39b5c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7" Oct 03 15:25:49 crc kubenswrapper[4774]: I1003 15:25:49.408867 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-ceilometer-compute-config-data-0\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7\" (UID: \"433217d2-80d5-452b-9980-c1aaac39b5c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7" Oct 03 15:25:49 crc kubenswrapper[4774]: I1003 15:25:49.408916 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7\" (UID: \"433217d2-80d5-452b-9980-c1aaac39b5c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7" Oct 03 15:25:49 crc kubenswrapper[4774]: I1003 15:25:49.408975 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7\" (UID: \"433217d2-80d5-452b-9980-c1aaac39b5c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7" Oct 03 15:25:49 crc kubenswrapper[4774]: I1003 15:25:49.409081 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7\" (UID: \"433217d2-80d5-452b-9980-c1aaac39b5c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7" Oct 03 15:25:49 crc kubenswrapper[4774]: I1003 15:25:49.409122 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7\" (UID: \"433217d2-80d5-452b-9980-c1aaac39b5c1\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7" Oct 03 15:25:49 crc kubenswrapper[4774]: I1003 15:25:49.409357 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7\" (UID: \"433217d2-80d5-452b-9980-c1aaac39b5c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7" Oct 03 15:25:49 crc kubenswrapper[4774]: I1003 15:25:49.510971 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cjrf\" (UniqueName: \"kubernetes.io/projected/433217d2-80d5-452b-9980-c1aaac39b5c1-kube-api-access-4cjrf\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7\" (UID: \"433217d2-80d5-452b-9980-c1aaac39b5c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7" Oct 03 15:25:49 crc kubenswrapper[4774]: I1003 15:25:49.511046 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7\" (UID: \"433217d2-80d5-452b-9980-c1aaac39b5c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7" Oct 03 15:25:49 crc kubenswrapper[4774]: I1003 15:25:49.511095 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7\" (UID: \"433217d2-80d5-452b-9980-c1aaac39b5c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7" Oct 03 15:25:49 crc kubenswrapper[4774]: I1003 15:25:49.511122 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7\" (UID: \"433217d2-80d5-452b-9980-c1aaac39b5c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7" Oct 03 15:25:49 crc kubenswrapper[4774]: I1003 15:25:49.511158 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7\" (UID: \"433217d2-80d5-452b-9980-c1aaac39b5c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7" Oct 03 15:25:49 crc kubenswrapper[4774]: I1003 15:25:49.511178 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7\" (UID: \"433217d2-80d5-452b-9980-c1aaac39b5c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7" Oct 03 15:25:49 crc kubenswrapper[4774]: I1003 15:25:49.511218 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7\" (UID: \"433217d2-80d5-452b-9980-c1aaac39b5c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7" Oct 03 15:25:49 crc kubenswrapper[4774]: I1003 15:25:49.516905 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-ceilometer-compute-config-data-1\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7\" (UID: \"433217d2-80d5-452b-9980-c1aaac39b5c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7" Oct 03 15:25:49 crc kubenswrapper[4774]: I1003 15:25:49.518062 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7\" (UID: \"433217d2-80d5-452b-9980-c1aaac39b5c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7" Oct 03 15:25:49 crc kubenswrapper[4774]: I1003 15:25:49.518517 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7\" (UID: \"433217d2-80d5-452b-9980-c1aaac39b5c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7" Oct 03 15:25:49 crc kubenswrapper[4774]: I1003 15:25:49.518567 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7\" (UID: \"433217d2-80d5-452b-9980-c1aaac39b5c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7" Oct 03 15:25:49 crc kubenswrapper[4774]: I1003 15:25:49.518716 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7\" (UID: \"433217d2-80d5-452b-9980-c1aaac39b5c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7" Oct 03 15:25:49 crc 
kubenswrapper[4774]: I1003 15:25:49.519267 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7\" (UID: \"433217d2-80d5-452b-9980-c1aaac39b5c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7" Oct 03 15:25:49 crc kubenswrapper[4774]: I1003 15:25:49.546720 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cjrf\" (UniqueName: \"kubernetes.io/projected/433217d2-80d5-452b-9980-c1aaac39b5c1-kube-api-access-4cjrf\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7\" (UID: \"433217d2-80d5-452b-9980-c1aaac39b5c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7" Oct 03 15:25:49 crc kubenswrapper[4774]: I1003 15:25:49.554442 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7" Oct 03 15:25:50 crc kubenswrapper[4774]: I1003 15:25:50.043450 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7"] Oct 03 15:25:50 crc kubenswrapper[4774]: I1003 15:25:50.131055 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7" event={"ID":"433217d2-80d5-452b-9980-c1aaac39b5c1","Type":"ContainerStarted","Data":"a3011b0a501146e66c514b7ca5dfe71df781ea40661603ff566c4068ba8c85b5"} Oct 03 15:25:52 crc kubenswrapper[4774]: I1003 15:25:52.153201 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7" event={"ID":"433217d2-80d5-452b-9980-c1aaac39b5c1","Type":"ContainerStarted","Data":"268fe4456a949d7ff3aedd4baed05a64720e5ae92e47709a5c9234d29d234637"} Oct 03 15:25:55 crc kubenswrapper[4774]: I1003 15:25:55.299707 4774 scope.go:117] 
"RemoveContainer" containerID="44e7bada9aed42aeacc03e1583cbd14ca821069273bbd22b595b5469bf0384a4" Oct 03 15:25:55 crc kubenswrapper[4774]: E1003 15:25:55.300536 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:26:04 crc kubenswrapper[4774]: I1003 15:26:04.019527 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7" podStartSLOduration=14.199536825 podStartE2EDuration="15.019504157s" podCreationTimestamp="2025-10-03 15:25:49 +0000 UTC" firstStartedPulling="2025-10-03 15:25:50.05055064 +0000 UTC m=+2572.639754092" lastFinishedPulling="2025-10-03 15:25:50.870517972 +0000 UTC m=+2573.459721424" observedRunningTime="2025-10-03 15:25:52.174772826 +0000 UTC m=+2574.763976318" watchObservedRunningTime="2025-10-03 15:26:04.019504157 +0000 UTC m=+2586.608707609" Oct 03 15:26:04 crc kubenswrapper[4774]: I1003 15:26:04.027132 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dtj6x"] Oct 03 15:26:04 crc kubenswrapper[4774]: I1003 15:26:04.031785 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dtj6x" Oct 03 15:26:04 crc kubenswrapper[4774]: I1003 15:26:04.041801 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dtj6x"] Oct 03 15:26:04 crc kubenswrapper[4774]: I1003 15:26:04.174525 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b87865d1-131e-48db-937d-b7521258fb5b-utilities\") pod \"redhat-operators-dtj6x\" (UID: \"b87865d1-131e-48db-937d-b7521258fb5b\") " pod="openshift-marketplace/redhat-operators-dtj6x" Oct 03 15:26:04 crc kubenswrapper[4774]: I1003 15:26:04.174972 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ltm9\" (UniqueName: \"kubernetes.io/projected/b87865d1-131e-48db-937d-b7521258fb5b-kube-api-access-4ltm9\") pod \"redhat-operators-dtj6x\" (UID: \"b87865d1-131e-48db-937d-b7521258fb5b\") " pod="openshift-marketplace/redhat-operators-dtj6x" Oct 03 15:26:04 crc kubenswrapper[4774]: I1003 15:26:04.175012 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b87865d1-131e-48db-937d-b7521258fb5b-catalog-content\") pod \"redhat-operators-dtj6x\" (UID: \"b87865d1-131e-48db-937d-b7521258fb5b\") " pod="openshift-marketplace/redhat-operators-dtj6x" Oct 03 15:26:04 crc kubenswrapper[4774]: I1003 15:26:04.280674 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ltm9\" (UniqueName: \"kubernetes.io/projected/b87865d1-131e-48db-937d-b7521258fb5b-kube-api-access-4ltm9\") pod \"redhat-operators-dtj6x\" (UID: \"b87865d1-131e-48db-937d-b7521258fb5b\") " pod="openshift-marketplace/redhat-operators-dtj6x" Oct 03 15:26:04 crc kubenswrapper[4774]: I1003 15:26:04.280748 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b87865d1-131e-48db-937d-b7521258fb5b-catalog-content\") pod \"redhat-operators-dtj6x\" (UID: \"b87865d1-131e-48db-937d-b7521258fb5b\") " pod="openshift-marketplace/redhat-operators-dtj6x" Oct 03 15:26:04 crc kubenswrapper[4774]: I1003 15:26:04.280870 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b87865d1-131e-48db-937d-b7521258fb5b-utilities\") pod \"redhat-operators-dtj6x\" (UID: \"b87865d1-131e-48db-937d-b7521258fb5b\") " pod="openshift-marketplace/redhat-operators-dtj6x" Oct 03 15:26:04 crc kubenswrapper[4774]: I1003 15:26:04.281478 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b87865d1-131e-48db-937d-b7521258fb5b-utilities\") pod \"redhat-operators-dtj6x\" (UID: \"b87865d1-131e-48db-937d-b7521258fb5b\") " pod="openshift-marketplace/redhat-operators-dtj6x" Oct 03 15:26:04 crc kubenswrapper[4774]: I1003 15:26:04.282152 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b87865d1-131e-48db-937d-b7521258fb5b-catalog-content\") pod \"redhat-operators-dtj6x\" (UID: \"b87865d1-131e-48db-937d-b7521258fb5b\") " pod="openshift-marketplace/redhat-operators-dtj6x" Oct 03 15:26:04 crc kubenswrapper[4774]: I1003 15:26:04.305787 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ltm9\" (UniqueName: \"kubernetes.io/projected/b87865d1-131e-48db-937d-b7521258fb5b-kube-api-access-4ltm9\") pod \"redhat-operators-dtj6x\" (UID: \"b87865d1-131e-48db-937d-b7521258fb5b\") " pod="openshift-marketplace/redhat-operators-dtj6x" Oct 03 15:26:04 crc kubenswrapper[4774]: I1003 15:26:04.364616 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dtj6x" Oct 03 15:26:04 crc kubenswrapper[4774]: I1003 15:26:04.622803 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mz7r4"] Oct 03 15:26:04 crc kubenswrapper[4774]: I1003 15:26:04.627968 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mz7r4" Oct 03 15:26:04 crc kubenswrapper[4774]: I1003 15:26:04.640004 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mz7r4"] Oct 03 15:26:04 crc kubenswrapper[4774]: I1003 15:26:04.695402 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8994c1d-19c0-4c5b-9e97-9afc244a84b8-utilities\") pod \"community-operators-mz7r4\" (UID: \"c8994c1d-19c0-4c5b-9e97-9afc244a84b8\") " pod="openshift-marketplace/community-operators-mz7r4" Oct 03 15:26:04 crc kubenswrapper[4774]: I1003 15:26:04.695465 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8994c1d-19c0-4c5b-9e97-9afc244a84b8-catalog-content\") pod \"community-operators-mz7r4\" (UID: \"c8994c1d-19c0-4c5b-9e97-9afc244a84b8\") " pod="openshift-marketplace/community-operators-mz7r4" Oct 03 15:26:04 crc kubenswrapper[4774]: I1003 15:26:04.695582 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zddlh\" (UniqueName: \"kubernetes.io/projected/c8994c1d-19c0-4c5b-9e97-9afc244a84b8-kube-api-access-zddlh\") pod \"community-operators-mz7r4\" (UID: \"c8994c1d-19c0-4c5b-9e97-9afc244a84b8\") " pod="openshift-marketplace/community-operators-mz7r4" Oct 03 15:26:04 crc kubenswrapper[4774]: I1003 15:26:04.798027 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-zddlh\" (UniqueName: \"kubernetes.io/projected/c8994c1d-19c0-4c5b-9e97-9afc244a84b8-kube-api-access-zddlh\") pod \"community-operators-mz7r4\" (UID: \"c8994c1d-19c0-4c5b-9e97-9afc244a84b8\") " pod="openshift-marketplace/community-operators-mz7r4" Oct 03 15:26:04 crc kubenswrapper[4774]: I1003 15:26:04.798193 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8994c1d-19c0-4c5b-9e97-9afc244a84b8-utilities\") pod \"community-operators-mz7r4\" (UID: \"c8994c1d-19c0-4c5b-9e97-9afc244a84b8\") " pod="openshift-marketplace/community-operators-mz7r4" Oct 03 15:26:04 crc kubenswrapper[4774]: I1003 15:26:04.798242 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8994c1d-19c0-4c5b-9e97-9afc244a84b8-catalog-content\") pod \"community-operators-mz7r4\" (UID: \"c8994c1d-19c0-4c5b-9e97-9afc244a84b8\") " pod="openshift-marketplace/community-operators-mz7r4" Oct 03 15:26:04 crc kubenswrapper[4774]: I1003 15:26:04.798815 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8994c1d-19c0-4c5b-9e97-9afc244a84b8-catalog-content\") pod \"community-operators-mz7r4\" (UID: \"c8994c1d-19c0-4c5b-9e97-9afc244a84b8\") " pod="openshift-marketplace/community-operators-mz7r4" Oct 03 15:26:04 crc kubenswrapper[4774]: I1003 15:26:04.799489 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8994c1d-19c0-4c5b-9e97-9afc244a84b8-utilities\") pod \"community-operators-mz7r4\" (UID: \"c8994c1d-19c0-4c5b-9e97-9afc244a84b8\") " pod="openshift-marketplace/community-operators-mz7r4" Oct 03 15:26:04 crc kubenswrapper[4774]: I1003 15:26:04.820963 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dtj6x"] Oct 03 
15:26:04 crc kubenswrapper[4774]: I1003 15:26:04.822035 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zddlh\" (UniqueName: \"kubernetes.io/projected/c8994c1d-19c0-4c5b-9e97-9afc244a84b8-kube-api-access-zddlh\") pod \"community-operators-mz7r4\" (UID: \"c8994c1d-19c0-4c5b-9e97-9afc244a84b8\") " pod="openshift-marketplace/community-operators-mz7r4" Oct 03 15:26:04 crc kubenswrapper[4774]: I1003 15:26:04.949181 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mz7r4" Oct 03 15:26:05 crc kubenswrapper[4774]: I1003 15:26:05.294866 4774 generic.go:334] "Generic (PLEG): container finished" podID="b87865d1-131e-48db-937d-b7521258fb5b" containerID="c9b30002248052575cbed962d96809ac14a7b82715236e4416f6db34f759fec7" exitCode=0 Oct 03 15:26:05 crc kubenswrapper[4774]: I1003 15:26:05.294910 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dtj6x" event={"ID":"b87865d1-131e-48db-937d-b7521258fb5b","Type":"ContainerDied","Data":"c9b30002248052575cbed962d96809ac14a7b82715236e4416f6db34f759fec7"} Oct 03 15:26:05 crc kubenswrapper[4774]: I1003 15:26:05.294939 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dtj6x" event={"ID":"b87865d1-131e-48db-937d-b7521258fb5b","Type":"ContainerStarted","Data":"8dd2dbd56f01851030cda34c63f8a333a3739255c4bebf5c49dbc05362a86e00"} Oct 03 15:26:05 crc kubenswrapper[4774]: I1003 15:26:05.514901 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mz7r4"] Oct 03 15:26:05 crc kubenswrapper[4774]: W1003 15:26:05.525549 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8994c1d_19c0_4c5b_9e97_9afc244a84b8.slice/crio-c942d79a95bb4aa4591911a811036d89224f0ee9949f147b5613b67975de0523 WatchSource:0}: Error finding 
container c942d79a95bb4aa4591911a811036d89224f0ee9949f147b5613b67975de0523: Status 404 returned error can't find the container with id c942d79a95bb4aa4591911a811036d89224f0ee9949f147b5613b67975de0523 Oct 03 15:26:06 crc kubenswrapper[4774]: I1003 15:26:06.309739 4774 generic.go:334] "Generic (PLEG): container finished" podID="c8994c1d-19c0-4c5b-9e97-9afc244a84b8" containerID="b0a676aeb9084b24cca0ed8dfd1d258a0c4807b58210e97a3cca07b1272fbd39" exitCode=0 Oct 03 15:26:06 crc kubenswrapper[4774]: I1003 15:26:06.309933 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mz7r4" event={"ID":"c8994c1d-19c0-4c5b-9e97-9afc244a84b8","Type":"ContainerDied","Data":"b0a676aeb9084b24cca0ed8dfd1d258a0c4807b58210e97a3cca07b1272fbd39"} Oct 03 15:26:06 crc kubenswrapper[4774]: I1003 15:26:06.310044 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mz7r4" event={"ID":"c8994c1d-19c0-4c5b-9e97-9afc244a84b8","Type":"ContainerStarted","Data":"c942d79a95bb4aa4591911a811036d89224f0ee9949f147b5613b67975de0523"} Oct 03 15:26:07 crc kubenswrapper[4774]: I1003 15:26:07.325719 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mz7r4" event={"ID":"c8994c1d-19c0-4c5b-9e97-9afc244a84b8","Type":"ContainerStarted","Data":"3d76f4666bdd7b93f47ebf64619b2df0502e9f2882d4c072ef2b6a1d8fed87df"} Oct 03 15:26:07 crc kubenswrapper[4774]: I1003 15:26:07.330426 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dtj6x" event={"ID":"b87865d1-131e-48db-937d-b7521258fb5b","Type":"ContainerStarted","Data":"69c1b665ac66579cd0aa75afc06475c1026953067b30e5bc8e80428ac25e2d83"} Oct 03 15:26:08 crc kubenswrapper[4774]: I1003 15:26:08.344642 4774 generic.go:334] "Generic (PLEG): container finished" podID="b87865d1-131e-48db-937d-b7521258fb5b" containerID="69c1b665ac66579cd0aa75afc06475c1026953067b30e5bc8e80428ac25e2d83" 
exitCode=0 Oct 03 15:26:08 crc kubenswrapper[4774]: I1003 15:26:08.344748 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dtj6x" event={"ID":"b87865d1-131e-48db-937d-b7521258fb5b","Type":"ContainerDied","Data":"69c1b665ac66579cd0aa75afc06475c1026953067b30e5bc8e80428ac25e2d83"} Oct 03 15:26:08 crc kubenswrapper[4774]: I1003 15:26:08.353970 4774 generic.go:334] "Generic (PLEG): container finished" podID="c8994c1d-19c0-4c5b-9e97-9afc244a84b8" containerID="3d76f4666bdd7b93f47ebf64619b2df0502e9f2882d4c072ef2b6a1d8fed87df" exitCode=0 Oct 03 15:26:08 crc kubenswrapper[4774]: I1003 15:26:08.354010 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mz7r4" event={"ID":"c8994c1d-19c0-4c5b-9e97-9afc244a84b8","Type":"ContainerDied","Data":"3d76f4666bdd7b93f47ebf64619b2df0502e9f2882d4c072ef2b6a1d8fed87df"} Oct 03 15:26:09 crc kubenswrapper[4774]: I1003 15:26:09.363115 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mz7r4" event={"ID":"c8994c1d-19c0-4c5b-9e97-9afc244a84b8","Type":"ContainerStarted","Data":"fb7eadfc5d0de4287f48fa5f17b262aff348579479a7266d58319b9c038d57d3"} Oct 03 15:26:09 crc kubenswrapper[4774]: I1003 15:26:09.364933 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dtj6x" event={"ID":"b87865d1-131e-48db-937d-b7521258fb5b","Type":"ContainerStarted","Data":"bc9b1b57ae4603644480e020861800b7f81ca97e0ab35a85eac5f8718d8a5079"} Oct 03 15:26:09 crc kubenswrapper[4774]: I1003 15:26:09.398330 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mz7r4" podStartSLOduration=2.91681866 podStartE2EDuration="5.398300114s" podCreationTimestamp="2025-10-03 15:26:04 +0000 UTC" firstStartedPulling="2025-10-03 15:26:06.3125826 +0000 UTC m=+2588.901786052" lastFinishedPulling="2025-10-03 15:26:08.794064054 +0000 UTC 
m=+2591.383267506" observedRunningTime="2025-10-03 15:26:09.390732915 +0000 UTC m=+2591.979936367" watchObservedRunningTime="2025-10-03 15:26:09.398300114 +0000 UTC m=+2591.987503566" Oct 03 15:26:09 crc kubenswrapper[4774]: I1003 15:26:09.420619 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dtj6x" podStartSLOduration=1.935827116 podStartE2EDuration="5.420588179s" podCreationTimestamp="2025-10-03 15:26:04 +0000 UTC" firstStartedPulling="2025-10-03 15:26:05.296644126 +0000 UTC m=+2587.885847578" lastFinishedPulling="2025-10-03 15:26:08.781405189 +0000 UTC m=+2591.370608641" observedRunningTime="2025-10-03 15:26:09.410355364 +0000 UTC m=+2591.999558816" watchObservedRunningTime="2025-10-03 15:26:09.420588179 +0000 UTC m=+2592.009791631" Oct 03 15:26:10 crc kubenswrapper[4774]: I1003 15:26:10.300619 4774 scope.go:117] "RemoveContainer" containerID="44e7bada9aed42aeacc03e1583cbd14ca821069273bbd22b595b5469bf0384a4" Oct 03 15:26:10 crc kubenswrapper[4774]: E1003 15:26:10.301003 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:26:14 crc kubenswrapper[4774]: I1003 15:26:14.365630 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dtj6x" Oct 03 15:26:14 crc kubenswrapper[4774]: I1003 15:26:14.366258 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dtj6x" Oct 03 15:26:14 crc kubenswrapper[4774]: I1003 15:26:14.446459 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-dtj6x" Oct 03 15:26:14 crc kubenswrapper[4774]: I1003 15:26:14.509221 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dtj6x" Oct 03 15:26:14 crc kubenswrapper[4774]: I1003 15:26:14.688202 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dtj6x"] Oct 03 15:26:14 crc kubenswrapper[4774]: I1003 15:26:14.950046 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mz7r4" Oct 03 15:26:14 crc kubenswrapper[4774]: I1003 15:26:14.950101 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mz7r4" Oct 03 15:26:15 crc kubenswrapper[4774]: I1003 15:26:15.013263 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mz7r4" Oct 03 15:26:15 crc kubenswrapper[4774]: I1003 15:26:15.503604 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mz7r4" Oct 03 15:26:16 crc kubenswrapper[4774]: I1003 15:26:16.432937 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dtj6x" podUID="b87865d1-131e-48db-937d-b7521258fb5b" containerName="registry-server" containerID="cri-o://bc9b1b57ae4603644480e020861800b7f81ca97e0ab35a85eac5f8718d8a5079" gracePeriod=2 Oct 03 15:26:16 crc kubenswrapper[4774]: I1003 15:26:16.898120 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dtj6x" Oct 03 15:26:16 crc kubenswrapper[4774]: I1003 15:26:16.963346 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ltm9\" (UniqueName: \"kubernetes.io/projected/b87865d1-131e-48db-937d-b7521258fb5b-kube-api-access-4ltm9\") pod \"b87865d1-131e-48db-937d-b7521258fb5b\" (UID: \"b87865d1-131e-48db-937d-b7521258fb5b\") " Oct 03 15:26:16 crc kubenswrapper[4774]: I1003 15:26:16.963445 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b87865d1-131e-48db-937d-b7521258fb5b-utilities\") pod \"b87865d1-131e-48db-937d-b7521258fb5b\" (UID: \"b87865d1-131e-48db-937d-b7521258fb5b\") " Oct 03 15:26:16 crc kubenswrapper[4774]: I1003 15:26:16.963680 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b87865d1-131e-48db-937d-b7521258fb5b-catalog-content\") pod \"b87865d1-131e-48db-937d-b7521258fb5b\" (UID: \"b87865d1-131e-48db-937d-b7521258fb5b\") " Oct 03 15:26:16 crc kubenswrapper[4774]: I1003 15:26:16.964488 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b87865d1-131e-48db-937d-b7521258fb5b-utilities" (OuterVolumeSpecName: "utilities") pod "b87865d1-131e-48db-937d-b7521258fb5b" (UID: "b87865d1-131e-48db-937d-b7521258fb5b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:26:16 crc kubenswrapper[4774]: I1003 15:26:16.968994 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b87865d1-131e-48db-937d-b7521258fb5b-kube-api-access-4ltm9" (OuterVolumeSpecName: "kube-api-access-4ltm9") pod "b87865d1-131e-48db-937d-b7521258fb5b" (UID: "b87865d1-131e-48db-937d-b7521258fb5b"). InnerVolumeSpecName "kube-api-access-4ltm9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:26:17 crc kubenswrapper[4774]: I1003 15:26:17.061935 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b87865d1-131e-48db-937d-b7521258fb5b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b87865d1-131e-48db-937d-b7521258fb5b" (UID: "b87865d1-131e-48db-937d-b7521258fb5b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:26:17 crc kubenswrapper[4774]: I1003 15:26:17.065608 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b87865d1-131e-48db-937d-b7521258fb5b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 15:26:17 crc kubenswrapper[4774]: I1003 15:26:17.065651 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ltm9\" (UniqueName: \"kubernetes.io/projected/b87865d1-131e-48db-937d-b7521258fb5b-kube-api-access-4ltm9\") on node \"crc\" DevicePath \"\"" Oct 03 15:26:17 crc kubenswrapper[4774]: I1003 15:26:17.065668 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b87865d1-131e-48db-937d-b7521258fb5b-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 15:26:17 crc kubenswrapper[4774]: I1003 15:26:17.085616 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mz7r4"] Oct 03 15:26:17 crc kubenswrapper[4774]: I1003 15:26:17.447404 4774 generic.go:334] "Generic (PLEG): container finished" podID="b87865d1-131e-48db-937d-b7521258fb5b" containerID="bc9b1b57ae4603644480e020861800b7f81ca97e0ab35a85eac5f8718d8a5079" exitCode=0 Oct 03 15:26:17 crc kubenswrapper[4774]: I1003 15:26:17.447543 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dtj6x" Oct 03 15:26:17 crc kubenswrapper[4774]: I1003 15:26:17.447562 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dtj6x" event={"ID":"b87865d1-131e-48db-937d-b7521258fb5b","Type":"ContainerDied","Data":"bc9b1b57ae4603644480e020861800b7f81ca97e0ab35a85eac5f8718d8a5079"} Oct 03 15:26:17 crc kubenswrapper[4774]: I1003 15:26:17.448211 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dtj6x" event={"ID":"b87865d1-131e-48db-937d-b7521258fb5b","Type":"ContainerDied","Data":"8dd2dbd56f01851030cda34c63f8a333a3739255c4bebf5c49dbc05362a86e00"} Oct 03 15:26:17 crc kubenswrapper[4774]: I1003 15:26:17.448263 4774 scope.go:117] "RemoveContainer" containerID="bc9b1b57ae4603644480e020861800b7f81ca97e0ab35a85eac5f8718d8a5079" Oct 03 15:26:17 crc kubenswrapper[4774]: I1003 15:26:17.449063 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mz7r4" podUID="c8994c1d-19c0-4c5b-9e97-9afc244a84b8" containerName="registry-server" containerID="cri-o://fb7eadfc5d0de4287f48fa5f17b262aff348579479a7266d58319b9c038d57d3" gracePeriod=2 Oct 03 15:26:17 crc kubenswrapper[4774]: I1003 15:26:17.488974 4774 scope.go:117] "RemoveContainer" containerID="69c1b665ac66579cd0aa75afc06475c1026953067b30e5bc8e80428ac25e2d83" Oct 03 15:26:17 crc kubenswrapper[4774]: I1003 15:26:17.488988 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dtj6x"] Oct 03 15:26:17 crc kubenswrapper[4774]: I1003 15:26:17.501483 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dtj6x"] Oct 03 15:26:17 crc kubenswrapper[4774]: I1003 15:26:17.522304 4774 scope.go:117] "RemoveContainer" containerID="c9b30002248052575cbed962d96809ac14a7b82715236e4416f6db34f759fec7" Oct 03 15:26:17 crc 
kubenswrapper[4774]: I1003 15:26:17.569249 4774 scope.go:117] "RemoveContainer" containerID="bc9b1b57ae4603644480e020861800b7f81ca97e0ab35a85eac5f8718d8a5079" Oct 03 15:26:17 crc kubenswrapper[4774]: E1003 15:26:17.569892 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc9b1b57ae4603644480e020861800b7f81ca97e0ab35a85eac5f8718d8a5079\": container with ID starting with bc9b1b57ae4603644480e020861800b7f81ca97e0ab35a85eac5f8718d8a5079 not found: ID does not exist" containerID="bc9b1b57ae4603644480e020861800b7f81ca97e0ab35a85eac5f8718d8a5079" Oct 03 15:26:17 crc kubenswrapper[4774]: I1003 15:26:17.569948 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc9b1b57ae4603644480e020861800b7f81ca97e0ab35a85eac5f8718d8a5079"} err="failed to get container status \"bc9b1b57ae4603644480e020861800b7f81ca97e0ab35a85eac5f8718d8a5079\": rpc error: code = NotFound desc = could not find container \"bc9b1b57ae4603644480e020861800b7f81ca97e0ab35a85eac5f8718d8a5079\": container with ID starting with bc9b1b57ae4603644480e020861800b7f81ca97e0ab35a85eac5f8718d8a5079 not found: ID does not exist" Oct 03 15:26:17 crc kubenswrapper[4774]: I1003 15:26:17.569993 4774 scope.go:117] "RemoveContainer" containerID="69c1b665ac66579cd0aa75afc06475c1026953067b30e5bc8e80428ac25e2d83" Oct 03 15:26:17 crc kubenswrapper[4774]: E1003 15:26:17.570554 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69c1b665ac66579cd0aa75afc06475c1026953067b30e5bc8e80428ac25e2d83\": container with ID starting with 69c1b665ac66579cd0aa75afc06475c1026953067b30e5bc8e80428ac25e2d83 not found: ID does not exist" containerID="69c1b665ac66579cd0aa75afc06475c1026953067b30e5bc8e80428ac25e2d83" Oct 03 15:26:17 crc kubenswrapper[4774]: I1003 15:26:17.570589 4774 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"69c1b665ac66579cd0aa75afc06475c1026953067b30e5bc8e80428ac25e2d83"} err="failed to get container status \"69c1b665ac66579cd0aa75afc06475c1026953067b30e5bc8e80428ac25e2d83\": rpc error: code = NotFound desc = could not find container \"69c1b665ac66579cd0aa75afc06475c1026953067b30e5bc8e80428ac25e2d83\": container with ID starting with 69c1b665ac66579cd0aa75afc06475c1026953067b30e5bc8e80428ac25e2d83 not found: ID does not exist" Oct 03 15:26:17 crc kubenswrapper[4774]: I1003 15:26:17.570615 4774 scope.go:117] "RemoveContainer" containerID="c9b30002248052575cbed962d96809ac14a7b82715236e4416f6db34f759fec7" Oct 03 15:26:17 crc kubenswrapper[4774]: E1003 15:26:17.571243 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9b30002248052575cbed962d96809ac14a7b82715236e4416f6db34f759fec7\": container with ID starting with c9b30002248052575cbed962d96809ac14a7b82715236e4416f6db34f759fec7 not found: ID does not exist" containerID="c9b30002248052575cbed962d96809ac14a7b82715236e4416f6db34f759fec7" Oct 03 15:26:17 crc kubenswrapper[4774]: I1003 15:26:17.571279 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9b30002248052575cbed962d96809ac14a7b82715236e4416f6db34f759fec7"} err="failed to get container status \"c9b30002248052575cbed962d96809ac14a7b82715236e4416f6db34f759fec7\": rpc error: code = NotFound desc = could not find container \"c9b30002248052575cbed962d96809ac14a7b82715236e4416f6db34f759fec7\": container with ID starting with c9b30002248052575cbed962d96809ac14a7b82715236e4416f6db34f759fec7 not found: ID does not exist" Oct 03 15:26:18 crc kubenswrapper[4774]: I1003 15:26:18.461192 4774 generic.go:334] "Generic (PLEG): container finished" podID="c8994c1d-19c0-4c5b-9e97-9afc244a84b8" containerID="fb7eadfc5d0de4287f48fa5f17b262aff348579479a7266d58319b9c038d57d3" exitCode=0 Oct 03 15:26:18 crc kubenswrapper[4774]: 
I1003 15:26:18.461298 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mz7r4" event={"ID":"c8994c1d-19c0-4c5b-9e97-9afc244a84b8","Type":"ContainerDied","Data":"fb7eadfc5d0de4287f48fa5f17b262aff348579479a7266d58319b9c038d57d3"} Oct 03 15:26:18 crc kubenswrapper[4774]: I1003 15:26:18.870418 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mz7r4" Oct 03 15:26:19 crc kubenswrapper[4774]: I1003 15:26:19.010164 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zddlh\" (UniqueName: \"kubernetes.io/projected/c8994c1d-19c0-4c5b-9e97-9afc244a84b8-kube-api-access-zddlh\") pod \"c8994c1d-19c0-4c5b-9e97-9afc244a84b8\" (UID: \"c8994c1d-19c0-4c5b-9e97-9afc244a84b8\") " Oct 03 15:26:19 crc kubenswrapper[4774]: I1003 15:26:19.010254 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8994c1d-19c0-4c5b-9e97-9afc244a84b8-catalog-content\") pod \"c8994c1d-19c0-4c5b-9e97-9afc244a84b8\" (UID: \"c8994c1d-19c0-4c5b-9e97-9afc244a84b8\") " Oct 03 15:26:19 crc kubenswrapper[4774]: I1003 15:26:19.010354 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8994c1d-19c0-4c5b-9e97-9afc244a84b8-utilities\") pod \"c8994c1d-19c0-4c5b-9e97-9afc244a84b8\" (UID: \"c8994c1d-19c0-4c5b-9e97-9afc244a84b8\") " Oct 03 15:26:19 crc kubenswrapper[4774]: I1003 15:26:19.011540 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8994c1d-19c0-4c5b-9e97-9afc244a84b8-utilities" (OuterVolumeSpecName: "utilities") pod "c8994c1d-19c0-4c5b-9e97-9afc244a84b8" (UID: "c8994c1d-19c0-4c5b-9e97-9afc244a84b8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:26:19 crc kubenswrapper[4774]: I1003 15:26:19.016242 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8994c1d-19c0-4c5b-9e97-9afc244a84b8-kube-api-access-zddlh" (OuterVolumeSpecName: "kube-api-access-zddlh") pod "c8994c1d-19c0-4c5b-9e97-9afc244a84b8" (UID: "c8994c1d-19c0-4c5b-9e97-9afc244a84b8"). InnerVolumeSpecName "kube-api-access-zddlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:26:19 crc kubenswrapper[4774]: I1003 15:26:19.089685 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8994c1d-19c0-4c5b-9e97-9afc244a84b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c8994c1d-19c0-4c5b-9e97-9afc244a84b8" (UID: "c8994c1d-19c0-4c5b-9e97-9afc244a84b8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:26:19 crc kubenswrapper[4774]: I1003 15:26:19.113301 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8994c1d-19c0-4c5b-9e97-9afc244a84b8-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 15:26:19 crc kubenswrapper[4774]: I1003 15:26:19.113337 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zddlh\" (UniqueName: \"kubernetes.io/projected/c8994c1d-19c0-4c5b-9e97-9afc244a84b8-kube-api-access-zddlh\") on node \"crc\" DevicePath \"\"" Oct 03 15:26:19 crc kubenswrapper[4774]: I1003 15:26:19.113356 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8994c1d-19c0-4c5b-9e97-9afc244a84b8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 15:26:19 crc kubenswrapper[4774]: I1003 15:26:19.318145 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b87865d1-131e-48db-937d-b7521258fb5b" 
path="/var/lib/kubelet/pods/b87865d1-131e-48db-937d-b7521258fb5b/volumes" Oct 03 15:26:19 crc kubenswrapper[4774]: I1003 15:26:19.472564 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mz7r4" event={"ID":"c8994c1d-19c0-4c5b-9e97-9afc244a84b8","Type":"ContainerDied","Data":"c942d79a95bb4aa4591911a811036d89224f0ee9949f147b5613b67975de0523"} Oct 03 15:26:19 crc kubenswrapper[4774]: I1003 15:26:19.472608 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mz7r4" Oct 03 15:26:19 crc kubenswrapper[4774]: I1003 15:26:19.472632 4774 scope.go:117] "RemoveContainer" containerID="fb7eadfc5d0de4287f48fa5f17b262aff348579479a7266d58319b9c038d57d3" Oct 03 15:26:19 crc kubenswrapper[4774]: I1003 15:26:19.500179 4774 scope.go:117] "RemoveContainer" containerID="3d76f4666bdd7b93f47ebf64619b2df0502e9f2882d4c072ef2b6a1d8fed87df" Oct 03 15:26:19 crc kubenswrapper[4774]: I1003 15:26:19.500931 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mz7r4"] Oct 03 15:26:19 crc kubenswrapper[4774]: I1003 15:26:19.508691 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mz7r4"] Oct 03 15:26:19 crc kubenswrapper[4774]: I1003 15:26:19.520656 4774 scope.go:117] "RemoveContainer" containerID="b0a676aeb9084b24cca0ed8dfd1d258a0c4807b58210e97a3cca07b1272fbd39" Oct 03 15:26:21 crc kubenswrapper[4774]: I1003 15:26:21.318217 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8994c1d-19c0-4c5b-9e97-9afc244a84b8" path="/var/lib/kubelet/pods/c8994c1d-19c0-4c5b-9e97-9afc244a84b8/volumes" Oct 03 15:26:22 crc kubenswrapper[4774]: I1003 15:26:22.299487 4774 scope.go:117] "RemoveContainer" containerID="44e7bada9aed42aeacc03e1583cbd14ca821069273bbd22b595b5469bf0384a4" Oct 03 15:26:23 crc kubenswrapper[4774]: I1003 15:26:23.519358 4774 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" event={"ID":"ca37ac4b-f421-4198-a179-12901d36f0f5","Type":"ContainerStarted","Data":"feff5047aec719d7bb7df138458f8ef5d1a9a2a9b949c0e51ed81eb70e458c8a"} Oct 03 15:27:03 crc kubenswrapper[4774]: I1003 15:27:03.064037 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-5c49d658df-5r5zg" podUID="4f82147b-63cd-44bc-8950-bf87fa407688" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Oct 03 15:28:17 crc kubenswrapper[4774]: I1003 15:28:17.754779 4774 generic.go:334] "Generic (PLEG): container finished" podID="433217d2-80d5-452b-9980-c1aaac39b5c1" containerID="268fe4456a949d7ff3aedd4baed05a64720e5ae92e47709a5c9234d29d234637" exitCode=0 Oct 03 15:28:17 crc kubenswrapper[4774]: I1003 15:28:17.755145 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7" event={"ID":"433217d2-80d5-452b-9980-c1aaac39b5c1","Type":"ContainerDied","Data":"268fe4456a949d7ff3aedd4baed05a64720e5ae92e47709a5c9234d29d234637"} Oct 03 15:28:19 crc kubenswrapper[4774]: I1003 15:28:19.221778 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7" Oct 03 15:28:19 crc kubenswrapper[4774]: I1003 15:28:19.323801 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-inventory\") pod \"433217d2-80d5-452b-9980-c1aaac39b5c1\" (UID: \"433217d2-80d5-452b-9980-c1aaac39b5c1\") " Oct 03 15:28:19 crc kubenswrapper[4774]: I1003 15:28:19.323886 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-ceilometer-compute-config-data-1\") pod \"433217d2-80d5-452b-9980-c1aaac39b5c1\" (UID: \"433217d2-80d5-452b-9980-c1aaac39b5c1\") " Oct 03 15:28:19 crc kubenswrapper[4774]: I1003 15:28:19.323971 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-ssh-key\") pod \"433217d2-80d5-452b-9980-c1aaac39b5c1\" (UID: \"433217d2-80d5-452b-9980-c1aaac39b5c1\") " Oct 03 15:28:19 crc kubenswrapper[4774]: I1003 15:28:19.324063 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-telemetry-combined-ca-bundle\") pod \"433217d2-80d5-452b-9980-c1aaac39b5c1\" (UID: \"433217d2-80d5-452b-9980-c1aaac39b5c1\") " Oct 03 15:28:19 crc kubenswrapper[4774]: I1003 15:28:19.324105 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-ceilometer-compute-config-data-0\") pod \"433217d2-80d5-452b-9980-c1aaac39b5c1\" (UID: \"433217d2-80d5-452b-9980-c1aaac39b5c1\") " Oct 03 15:28:19 crc kubenswrapper[4774]: I1003 15:28:19.324182 4774 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cjrf\" (UniqueName: \"kubernetes.io/projected/433217d2-80d5-452b-9980-c1aaac39b5c1-kube-api-access-4cjrf\") pod \"433217d2-80d5-452b-9980-c1aaac39b5c1\" (UID: \"433217d2-80d5-452b-9980-c1aaac39b5c1\") " Oct 03 15:28:19 crc kubenswrapper[4774]: I1003 15:28:19.324232 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-ceilometer-compute-config-data-2\") pod \"433217d2-80d5-452b-9980-c1aaac39b5c1\" (UID: \"433217d2-80d5-452b-9980-c1aaac39b5c1\") " Oct 03 15:28:19 crc kubenswrapper[4774]: I1003 15:28:19.330459 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/433217d2-80d5-452b-9980-c1aaac39b5c1-kube-api-access-4cjrf" (OuterVolumeSpecName: "kube-api-access-4cjrf") pod "433217d2-80d5-452b-9980-c1aaac39b5c1" (UID: "433217d2-80d5-452b-9980-c1aaac39b5c1"). InnerVolumeSpecName "kube-api-access-4cjrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:28:19 crc kubenswrapper[4774]: I1003 15:28:19.330498 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "433217d2-80d5-452b-9980-c1aaac39b5c1" (UID: "433217d2-80d5-452b-9980-c1aaac39b5c1"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:28:19 crc kubenswrapper[4774]: I1003 15:28:19.354099 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-inventory" (OuterVolumeSpecName: "inventory") pod "433217d2-80d5-452b-9980-c1aaac39b5c1" (UID: "433217d2-80d5-452b-9980-c1aaac39b5c1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:28:19 crc kubenswrapper[4774]: I1003 15:28:19.360784 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "433217d2-80d5-452b-9980-c1aaac39b5c1" (UID: "433217d2-80d5-452b-9980-c1aaac39b5c1"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:28:19 crc kubenswrapper[4774]: I1003 15:28:19.361723 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "433217d2-80d5-452b-9980-c1aaac39b5c1" (UID: "433217d2-80d5-452b-9980-c1aaac39b5c1"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:28:19 crc kubenswrapper[4774]: I1003 15:28:19.369593 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "433217d2-80d5-452b-9980-c1aaac39b5c1" (UID: "433217d2-80d5-452b-9980-c1aaac39b5c1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:28:19 crc kubenswrapper[4774]: I1003 15:28:19.370046 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "433217d2-80d5-452b-9980-c1aaac39b5c1" (UID: "433217d2-80d5-452b-9980-c1aaac39b5c1"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:28:19 crc kubenswrapper[4774]: I1003 15:28:19.426972 4774 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 15:28:19 crc kubenswrapper[4774]: I1003 15:28:19.427018 4774 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 03 15:28:19 crc kubenswrapper[4774]: I1003 15:28:19.427037 4774 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 15:28:19 crc kubenswrapper[4774]: I1003 15:28:19.427049 4774 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:28:19 crc kubenswrapper[4774]: I1003 15:28:19.427060 4774 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 03 15:28:19 crc kubenswrapper[4774]: I1003 15:28:19.427074 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cjrf\" (UniqueName: \"kubernetes.io/projected/433217d2-80d5-452b-9980-c1aaac39b5c1-kube-api-access-4cjrf\") on node \"crc\" DevicePath \"\"" Oct 03 15:28:19 crc kubenswrapper[4774]: I1003 15:28:19.427085 4774 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: 
\"kubernetes.io/secret/433217d2-80d5-452b-9980-c1aaac39b5c1-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 03 15:28:19 crc kubenswrapper[4774]: I1003 15:28:19.777578 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7" event={"ID":"433217d2-80d5-452b-9980-c1aaac39b5c1","Type":"ContainerDied","Data":"a3011b0a501146e66c514b7ca5dfe71df781ea40661603ff566c4068ba8c85b5"} Oct 03 15:28:19 crc kubenswrapper[4774]: I1003 15:28:19.777957 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3011b0a501146e66c514b7ca5dfe71df781ea40661603ff566c4068ba8c85b5" Oct 03 15:28:19 crc kubenswrapper[4774]: I1003 15:28:19.777652 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7" Oct 03 15:28:38 crc kubenswrapper[4774]: E1003 15:28:38.765704 4774 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.32:59596->38.102.83.32:40969: write tcp 38.102.83.32:59596->38.102.83.32:40969: write: broken pipe Oct 03 15:28:50 crc kubenswrapper[4774]: I1003 15:28:50.653926 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:28:50 crc kubenswrapper[4774]: I1003 15:28:50.654575 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.578177 4774 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/tempest-tests-tempest"] Oct 03 15:29:07 crc kubenswrapper[4774]: E1003 15:29:07.579050 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b87865d1-131e-48db-937d-b7521258fb5b" containerName="extract-utilities" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.579064 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="b87865d1-131e-48db-937d-b7521258fb5b" containerName="extract-utilities" Oct 03 15:29:07 crc kubenswrapper[4774]: E1003 15:29:07.579080 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8994c1d-19c0-4c5b-9e97-9afc244a84b8" containerName="extract-content" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.579087 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8994c1d-19c0-4c5b-9e97-9afc244a84b8" containerName="extract-content" Oct 03 15:29:07 crc kubenswrapper[4774]: E1003 15:29:07.579094 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="433217d2-80d5-452b-9980-c1aaac39b5c1" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.579103 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="433217d2-80d5-452b-9980-c1aaac39b5c1" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 03 15:29:07 crc kubenswrapper[4774]: E1003 15:29:07.579114 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b87865d1-131e-48db-937d-b7521258fb5b" containerName="extract-content" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.579121 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="b87865d1-131e-48db-937d-b7521258fb5b" containerName="extract-content" Oct 03 15:29:07 crc kubenswrapper[4774]: E1003 15:29:07.579131 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8994c1d-19c0-4c5b-9e97-9afc244a84b8" containerName="registry-server" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.579137 4774 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="c8994c1d-19c0-4c5b-9e97-9afc244a84b8" containerName="registry-server" Oct 03 15:29:07 crc kubenswrapper[4774]: E1003 15:29:07.579154 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b87865d1-131e-48db-937d-b7521258fb5b" containerName="registry-server" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.579159 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="b87865d1-131e-48db-937d-b7521258fb5b" containerName="registry-server" Oct 03 15:29:07 crc kubenswrapper[4774]: E1003 15:29:07.579167 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8994c1d-19c0-4c5b-9e97-9afc244a84b8" containerName="extract-utilities" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.579175 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8994c1d-19c0-4c5b-9e97-9afc244a84b8" containerName="extract-utilities" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.579434 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="433217d2-80d5-452b-9980-c1aaac39b5c1" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.579453 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8994c1d-19c0-4c5b-9e97-9afc244a84b8" containerName="registry-server" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.579477 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="b87865d1-131e-48db-937d-b7521258fb5b" containerName="registry-server" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.580206 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.582991 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.583179 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.583003 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.583076 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-pk52n" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.594089 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.663234 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") " pod="openstack/tempest-tests-tempest" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.663418 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") " pod="openstack/tempest-tests-tempest" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.663500 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod 
\"tempest-tests-tempest\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") " pod="openstack/tempest-tests-tempest" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.663564 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") " pod="openstack/tempest-tests-tempest" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.663591 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") " pod="openstack/tempest-tests-tempest" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.663611 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") " pod="openstack/tempest-tests-tempest" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.663630 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-config-data\") pod \"tempest-tests-tempest\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") " pod="openstack/tempest-tests-tempest" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.663766 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-test-operator-ephemeral-workdir\") 
pod \"tempest-tests-tempest\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") " pod="openstack/tempest-tests-tempest" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.663802 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqw5h\" (UniqueName: \"kubernetes.io/projected/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-kube-api-access-bqw5h\") pod \"tempest-tests-tempest\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") " pod="openstack/tempest-tests-tempest" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.765348 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") " pod="openstack/tempest-tests-tempest" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.765479 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") " pod="openstack/tempest-tests-tempest" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.765537 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") " pod="openstack/tempest-tests-tempest" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.765588 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") " 
pod="openstack/tempest-tests-tempest" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.765623 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") " pod="openstack/tempest-tests-tempest" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.765653 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") " pod="openstack/tempest-tests-tempest" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.765686 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-config-data\") pod \"tempest-tests-tempest\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") " pod="openstack/tempest-tests-tempest" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.765794 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") " pod="openstack/tempest-tests-tempest" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.765843 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqw5h\" (UniqueName: \"kubernetes.io/projected/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-kube-api-access-bqw5h\") pod \"tempest-tests-tempest\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") " pod="openstack/tempest-tests-tempest" Oct 03 15:29:07 crc kubenswrapper[4774]: 
I1003 15:29:07.766046 4774 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/tempest-tests-tempest" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.766054 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") " pod="openstack/tempest-tests-tempest" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.767074 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") " pod="openstack/tempest-tests-tempest" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.767189 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-config-data\") pod \"tempest-tests-tempest\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") " pod="openstack/tempest-tests-tempest" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.767726 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") " pod="openstack/tempest-tests-tempest" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.772525 4774 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") " pod="openstack/tempest-tests-tempest" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.774171 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") " pod="openstack/tempest-tests-tempest" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.774859 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") " pod="openstack/tempest-tests-tempest" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.792691 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqw5h\" (UniqueName: \"kubernetes.io/projected/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-kube-api-access-bqw5h\") pod \"tempest-tests-tempest\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") " pod="openstack/tempest-tests-tempest" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.797190 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") " pod="openstack/tempest-tests-tempest" Oct 03 15:29:07 crc kubenswrapper[4774]: I1003 15:29:07.908751 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 03 15:29:08 crc kubenswrapper[4774]: I1003 15:29:08.386712 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 03 15:29:08 crc kubenswrapper[4774]: W1003 15:29:08.393725 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b63f9fa_2194_46e4_bfe0_d7efb33f10fb.slice/crio-9ef358004b1c1acc8e28ad9a0e97a6fe55caaf6a859f8edeb9cc25f78073ded5 WatchSource:0}: Error finding container 9ef358004b1c1acc8e28ad9a0e97a6fe55caaf6a859f8edeb9cc25f78073ded5: Status 404 returned error can't find the container with id 9ef358004b1c1acc8e28ad9a0e97a6fe55caaf6a859f8edeb9cc25f78073ded5 Oct 03 15:29:08 crc kubenswrapper[4774]: I1003 15:29:08.396494 4774 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 15:29:09 crc kubenswrapper[4774]: I1003 15:29:09.296256 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb","Type":"ContainerStarted","Data":"9ef358004b1c1acc8e28ad9a0e97a6fe55caaf6a859f8edeb9cc25f78073ded5"} Oct 03 15:29:13 crc kubenswrapper[4774]: I1003 15:29:13.494460 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-447jw"] Oct 03 15:29:13 crc kubenswrapper[4774]: I1003 15:29:13.498326 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-447jw" Oct 03 15:29:13 crc kubenswrapper[4774]: I1003 15:29:13.508472 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-447jw"] Oct 03 15:29:13 crc kubenswrapper[4774]: I1003 15:29:13.601224 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/004d610c-a775-42bc-9a46-fa30fb50c03c-utilities\") pod \"certified-operators-447jw\" (UID: \"004d610c-a775-42bc-9a46-fa30fb50c03c\") " pod="openshift-marketplace/certified-operators-447jw" Oct 03 15:29:13 crc kubenswrapper[4774]: I1003 15:29:13.601430 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/004d610c-a775-42bc-9a46-fa30fb50c03c-catalog-content\") pod \"certified-operators-447jw\" (UID: \"004d610c-a775-42bc-9a46-fa30fb50c03c\") " pod="openshift-marketplace/certified-operators-447jw" Oct 03 15:29:13 crc kubenswrapper[4774]: I1003 15:29:13.601491 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76f5h\" (UniqueName: \"kubernetes.io/projected/004d610c-a775-42bc-9a46-fa30fb50c03c-kube-api-access-76f5h\") pod \"certified-operators-447jw\" (UID: \"004d610c-a775-42bc-9a46-fa30fb50c03c\") " pod="openshift-marketplace/certified-operators-447jw" Oct 03 15:29:13 crc kubenswrapper[4774]: I1003 15:29:13.702874 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/004d610c-a775-42bc-9a46-fa30fb50c03c-catalog-content\") pod \"certified-operators-447jw\" (UID: \"004d610c-a775-42bc-9a46-fa30fb50c03c\") " pod="openshift-marketplace/certified-operators-447jw" Oct 03 15:29:13 crc kubenswrapper[4774]: I1003 15:29:13.702938 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-76f5h\" (UniqueName: \"kubernetes.io/projected/004d610c-a775-42bc-9a46-fa30fb50c03c-kube-api-access-76f5h\") pod \"certified-operators-447jw\" (UID: \"004d610c-a775-42bc-9a46-fa30fb50c03c\") " pod="openshift-marketplace/certified-operators-447jw" Oct 03 15:29:13 crc kubenswrapper[4774]: I1003 15:29:13.702966 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/004d610c-a775-42bc-9a46-fa30fb50c03c-utilities\") pod \"certified-operators-447jw\" (UID: \"004d610c-a775-42bc-9a46-fa30fb50c03c\") " pod="openshift-marketplace/certified-operators-447jw" Oct 03 15:29:13 crc kubenswrapper[4774]: I1003 15:29:13.703452 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/004d610c-a775-42bc-9a46-fa30fb50c03c-catalog-content\") pod \"certified-operators-447jw\" (UID: \"004d610c-a775-42bc-9a46-fa30fb50c03c\") " pod="openshift-marketplace/certified-operators-447jw" Oct 03 15:29:13 crc kubenswrapper[4774]: I1003 15:29:13.703737 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/004d610c-a775-42bc-9a46-fa30fb50c03c-utilities\") pod \"certified-operators-447jw\" (UID: \"004d610c-a775-42bc-9a46-fa30fb50c03c\") " pod="openshift-marketplace/certified-operators-447jw" Oct 03 15:29:13 crc kubenswrapper[4774]: I1003 15:29:13.725028 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76f5h\" (UniqueName: \"kubernetes.io/projected/004d610c-a775-42bc-9a46-fa30fb50c03c-kube-api-access-76f5h\") pod \"certified-operators-447jw\" (UID: \"004d610c-a775-42bc-9a46-fa30fb50c03c\") " pod="openshift-marketplace/certified-operators-447jw" Oct 03 15:29:13 crc kubenswrapper[4774]: I1003 15:29:13.842952 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-447jw" Oct 03 15:29:14 crc kubenswrapper[4774]: I1003 15:29:14.368132 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-447jw"] Oct 03 15:29:14 crc kubenswrapper[4774]: W1003 15:29:14.375277 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod004d610c_a775_42bc_9a46_fa30fb50c03c.slice/crio-d03419a9f231ed008b50add281eba463414e29e45308aac32b2919049b8c56d3 WatchSource:0}: Error finding container d03419a9f231ed008b50add281eba463414e29e45308aac32b2919049b8c56d3: Status 404 returned error can't find the container with id d03419a9f231ed008b50add281eba463414e29e45308aac32b2919049b8c56d3 Oct 03 15:29:15 crc kubenswrapper[4774]: I1003 15:29:15.349678 4774 generic.go:334] "Generic (PLEG): container finished" podID="004d610c-a775-42bc-9a46-fa30fb50c03c" containerID="cdedc8ea767656dcfbbc9999bff6e904b6f691db68545ca51bd256e5b80abc2d" exitCode=0 Oct 03 15:29:15 crc kubenswrapper[4774]: I1003 15:29:15.349725 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-447jw" event={"ID":"004d610c-a775-42bc-9a46-fa30fb50c03c","Type":"ContainerDied","Data":"cdedc8ea767656dcfbbc9999bff6e904b6f691db68545ca51bd256e5b80abc2d"} Oct 03 15:29:15 crc kubenswrapper[4774]: I1003 15:29:15.350027 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-447jw" event={"ID":"004d610c-a775-42bc-9a46-fa30fb50c03c","Type":"ContainerStarted","Data":"d03419a9f231ed008b50add281eba463414e29e45308aac32b2919049b8c56d3"} Oct 03 15:29:17 crc kubenswrapper[4774]: I1003 15:29:17.368581 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-447jw" 
event={"ID":"004d610c-a775-42bc-9a46-fa30fb50c03c","Type":"ContainerStarted","Data":"c6f5c79d5d874381aba410ae91a91038c1ee154da651a7787c607fd98249e0df"} Oct 03 15:29:18 crc kubenswrapper[4774]: I1003 15:29:18.378680 4774 generic.go:334] "Generic (PLEG): container finished" podID="004d610c-a775-42bc-9a46-fa30fb50c03c" containerID="c6f5c79d5d874381aba410ae91a91038c1ee154da651a7787c607fd98249e0df" exitCode=0 Oct 03 15:29:18 crc kubenswrapper[4774]: I1003 15:29:18.378736 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-447jw" event={"ID":"004d610c-a775-42bc-9a46-fa30fb50c03c","Type":"ContainerDied","Data":"c6f5c79d5d874381aba410ae91a91038c1ee154da651a7787c607fd98249e0df"} Oct 03 15:29:20 crc kubenswrapper[4774]: I1003 15:29:20.653593 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:29:20 crc kubenswrapper[4774]: I1003 15:29:20.654263 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:29:42 crc kubenswrapper[4774]: E1003 15:29:42.187070 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Oct 03 15:29:42 crc kubenswrapper[4774]: E1003 15:29:42.188612 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bqw5h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(7b63f9fa-2194-46e4-bfe0-d7efb33f10fb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 15:29:42 crc kubenswrapper[4774]: E1003 15:29:42.190210 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="7b63f9fa-2194-46e4-bfe0-d7efb33f10fb" Oct 03 15:29:42 crc kubenswrapper[4774]: I1003 15:29:42.635436 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-447jw" event={"ID":"004d610c-a775-42bc-9a46-fa30fb50c03c","Type":"ContainerStarted","Data":"4947aa112823965eac9597821c22b4d16233863f93b2bc75e6c62f6298f86ea1"} Oct 03 15:29:42 crc kubenswrapper[4774]: E1003 15:29:42.636838 4774 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="7b63f9fa-2194-46e4-bfe0-d7efb33f10fb" Oct 03 15:29:42 crc kubenswrapper[4774]: I1003 15:29:42.659234 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-447jw" podStartSLOduration=2.877305227 podStartE2EDuration="29.659218518s" podCreationTimestamp="2025-10-03 15:29:13 +0000 UTC" firstStartedPulling="2025-10-03 15:29:15.35128467 +0000 UTC m=+2777.940488122" lastFinishedPulling="2025-10-03 15:29:42.133197961 +0000 UTC m=+2804.722401413" observedRunningTime="2025-10-03 15:29:42.65770256 +0000 UTC m=+2805.246906022" watchObservedRunningTime="2025-10-03 15:29:42.659218518 +0000 UTC m=+2805.248421970" Oct 03 15:29:43 crc kubenswrapper[4774]: I1003 15:29:43.844269 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-447jw" Oct 03 15:29:43 crc kubenswrapper[4774]: I1003 15:29:43.844344 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-447jw" Oct 03 15:29:44 crc kubenswrapper[4774]: I1003 15:29:44.889711 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-447jw" podUID="004d610c-a775-42bc-9a46-fa30fb50c03c" containerName="registry-server" probeResult="failure" output=< Oct 03 15:29:44 crc kubenswrapper[4774]: timeout: failed to connect service ":50051" within 1s Oct 03 15:29:44 crc kubenswrapper[4774]: > Oct 03 15:29:50 crc kubenswrapper[4774]: I1003 15:29:50.654027 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:29:50 crc kubenswrapper[4774]: I1003 15:29:50.654762 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:29:50 crc kubenswrapper[4774]: I1003 15:29:50.654812 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" Oct 03 15:29:50 crc kubenswrapper[4774]: I1003 15:29:50.655647 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"feff5047aec719d7bb7df138458f8ef5d1a9a2a9b949c0e51ed81eb70e458c8a"} pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 15:29:50 crc kubenswrapper[4774]: I1003 15:29:50.655711 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" containerID="cri-o://feff5047aec719d7bb7df138458f8ef5d1a9a2a9b949c0e51ed81eb70e458c8a" gracePeriod=600 Oct 03 15:29:51 crc kubenswrapper[4774]: I1003 15:29:51.728819 4774 generic.go:334] "Generic (PLEG): container finished" podID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerID="feff5047aec719d7bb7df138458f8ef5d1a9a2a9b949c0e51ed81eb70e458c8a" exitCode=0 Oct 03 15:29:51 crc kubenswrapper[4774]: I1003 15:29:51.729045 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" 
event={"ID":"ca37ac4b-f421-4198-a179-12901d36f0f5","Type":"ContainerDied","Data":"feff5047aec719d7bb7df138458f8ef5d1a9a2a9b949c0e51ed81eb70e458c8a"} Oct 03 15:29:51 crc kubenswrapper[4774]: I1003 15:29:51.729614 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" event={"ID":"ca37ac4b-f421-4198-a179-12901d36f0f5","Type":"ContainerStarted","Data":"cfea26b55e1d9b962f87ac922eb7772356d541492b505a24f7ff4d97f8d38ef7"} Oct 03 15:29:51 crc kubenswrapper[4774]: I1003 15:29:51.729656 4774 scope.go:117] "RemoveContainer" containerID="44e7bada9aed42aeacc03e1583cbd14ca821069273bbd22b595b5469bf0384a4" Oct 03 15:29:54 crc kubenswrapper[4774]: I1003 15:29:54.909506 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-447jw" podUID="004d610c-a775-42bc-9a46-fa30fb50c03c" containerName="registry-server" probeResult="failure" output=< Oct 03 15:29:54 crc kubenswrapper[4774]: timeout: failed to connect service ":50051" within 1s Oct 03 15:29:54 crc kubenswrapper[4774]: > Oct 03 15:29:58 crc kubenswrapper[4774]: I1003 15:29:58.801689 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb","Type":"ContainerStarted","Data":"723acbbb0d8164532a2cf396d95d9bc57eef75d84b4a4a86a082d9e9ca3ffb6b"} Oct 03 15:29:58 crc kubenswrapper[4774]: I1003 15:29:58.827757 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.763475061 podStartE2EDuration="52.827739219s" podCreationTimestamp="2025-10-03 15:29:06 +0000 UTC" firstStartedPulling="2025-10-03 15:29:08.396224944 +0000 UTC m=+2770.985428396" lastFinishedPulling="2025-10-03 15:29:57.460489102 +0000 UTC m=+2820.049692554" observedRunningTime="2025-10-03 15:29:58.821565567 +0000 UTC m=+2821.410769039" watchObservedRunningTime="2025-10-03 15:29:58.827739219 +0000 UTC 
m=+2821.416942671" Oct 03 15:30:00 crc kubenswrapper[4774]: I1003 15:30:00.149002 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325090-6mdvk"] Oct 03 15:30:00 crc kubenswrapper[4774]: I1003 15:30:00.153114 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325090-6mdvk" Oct 03 15:30:00 crc kubenswrapper[4774]: I1003 15:30:00.157095 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 15:30:00 crc kubenswrapper[4774]: I1003 15:30:00.158134 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 15:30:00 crc kubenswrapper[4774]: I1003 15:30:00.183104 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325090-6mdvk"] Oct 03 15:30:00 crc kubenswrapper[4774]: I1003 15:30:00.190268 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/459c496f-feb1-471f-b498-ea121d171926-config-volume\") pod \"collect-profiles-29325090-6mdvk\" (UID: \"459c496f-feb1-471f-b498-ea121d171926\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325090-6mdvk" Oct 03 15:30:00 crc kubenswrapper[4774]: I1003 15:30:00.190322 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtmnc\" (UniqueName: \"kubernetes.io/projected/459c496f-feb1-471f-b498-ea121d171926-kube-api-access-vtmnc\") pod \"collect-profiles-29325090-6mdvk\" (UID: \"459c496f-feb1-471f-b498-ea121d171926\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325090-6mdvk" Oct 03 15:30:00 crc kubenswrapper[4774]: I1003 15:30:00.190437 4774 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/459c496f-feb1-471f-b498-ea121d171926-secret-volume\") pod \"collect-profiles-29325090-6mdvk\" (UID: \"459c496f-feb1-471f-b498-ea121d171926\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325090-6mdvk" Oct 03 15:30:00 crc kubenswrapper[4774]: I1003 15:30:00.293453 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/459c496f-feb1-471f-b498-ea121d171926-config-volume\") pod \"collect-profiles-29325090-6mdvk\" (UID: \"459c496f-feb1-471f-b498-ea121d171926\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325090-6mdvk" Oct 03 15:30:00 crc kubenswrapper[4774]: I1003 15:30:00.293580 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtmnc\" (UniqueName: \"kubernetes.io/projected/459c496f-feb1-471f-b498-ea121d171926-kube-api-access-vtmnc\") pod \"collect-profiles-29325090-6mdvk\" (UID: \"459c496f-feb1-471f-b498-ea121d171926\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325090-6mdvk" Oct 03 15:30:00 crc kubenswrapper[4774]: I1003 15:30:00.293754 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/459c496f-feb1-471f-b498-ea121d171926-secret-volume\") pod \"collect-profiles-29325090-6mdvk\" (UID: \"459c496f-feb1-471f-b498-ea121d171926\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325090-6mdvk" Oct 03 15:30:00 crc kubenswrapper[4774]: I1003 15:30:00.294460 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/459c496f-feb1-471f-b498-ea121d171926-config-volume\") pod \"collect-profiles-29325090-6mdvk\" (UID: \"459c496f-feb1-471f-b498-ea121d171926\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29325090-6mdvk" Oct 03 15:30:00 crc kubenswrapper[4774]: I1003 15:30:00.309942 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/459c496f-feb1-471f-b498-ea121d171926-secret-volume\") pod \"collect-profiles-29325090-6mdvk\" (UID: \"459c496f-feb1-471f-b498-ea121d171926\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325090-6mdvk" Oct 03 15:30:00 crc kubenswrapper[4774]: I1003 15:30:00.321006 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtmnc\" (UniqueName: \"kubernetes.io/projected/459c496f-feb1-471f-b498-ea121d171926-kube-api-access-vtmnc\") pod \"collect-profiles-29325090-6mdvk\" (UID: \"459c496f-feb1-471f-b498-ea121d171926\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325090-6mdvk" Oct 03 15:30:00 crc kubenswrapper[4774]: I1003 15:30:00.492969 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325090-6mdvk" Oct 03 15:30:00 crc kubenswrapper[4774]: I1003 15:30:00.938895 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325090-6mdvk"] Oct 03 15:30:01 crc kubenswrapper[4774]: I1003 15:30:01.831621 4774 generic.go:334] "Generic (PLEG): container finished" podID="459c496f-feb1-471f-b498-ea121d171926" containerID="976fb870a0af091fe5004b7ee1dfe7d9703336616ab7243fcecd4474b3c59697" exitCode=0 Oct 03 15:30:01 crc kubenswrapper[4774]: I1003 15:30:01.831723 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325090-6mdvk" event={"ID":"459c496f-feb1-471f-b498-ea121d171926","Type":"ContainerDied","Data":"976fb870a0af091fe5004b7ee1dfe7d9703336616ab7243fcecd4474b3c59697"} Oct 03 15:30:01 crc kubenswrapper[4774]: I1003 15:30:01.832115 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325090-6mdvk" event={"ID":"459c496f-feb1-471f-b498-ea121d171926","Type":"ContainerStarted","Data":"6d3e4c460e6fa3750c07a066b85742e12953a12e8e3163f3755f66c55f9845f2"} Oct 03 15:30:03 crc kubenswrapper[4774]: I1003 15:30:03.243907 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325090-6mdvk" Oct 03 15:30:03 crc kubenswrapper[4774]: I1003 15:30:03.247804 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/459c496f-feb1-471f-b498-ea121d171926-secret-volume\") pod \"459c496f-feb1-471f-b498-ea121d171926\" (UID: \"459c496f-feb1-471f-b498-ea121d171926\") " Oct 03 15:30:03 crc kubenswrapper[4774]: I1003 15:30:03.247857 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtmnc\" (UniqueName: \"kubernetes.io/projected/459c496f-feb1-471f-b498-ea121d171926-kube-api-access-vtmnc\") pod \"459c496f-feb1-471f-b498-ea121d171926\" (UID: \"459c496f-feb1-471f-b498-ea121d171926\") " Oct 03 15:30:03 crc kubenswrapper[4774]: I1003 15:30:03.247959 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/459c496f-feb1-471f-b498-ea121d171926-config-volume\") pod \"459c496f-feb1-471f-b498-ea121d171926\" (UID: \"459c496f-feb1-471f-b498-ea121d171926\") " Oct 03 15:30:03 crc kubenswrapper[4774]: I1003 15:30:03.248695 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/459c496f-feb1-471f-b498-ea121d171926-config-volume" (OuterVolumeSpecName: "config-volume") pod "459c496f-feb1-471f-b498-ea121d171926" (UID: "459c496f-feb1-471f-b498-ea121d171926"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:30:03 crc kubenswrapper[4774]: I1003 15:30:03.252330 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/459c496f-feb1-471f-b498-ea121d171926-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "459c496f-feb1-471f-b498-ea121d171926" (UID: "459c496f-feb1-471f-b498-ea121d171926"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:30:03 crc kubenswrapper[4774]: I1003 15:30:03.260811 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/459c496f-feb1-471f-b498-ea121d171926-kube-api-access-vtmnc" (OuterVolumeSpecName: "kube-api-access-vtmnc") pod "459c496f-feb1-471f-b498-ea121d171926" (UID: "459c496f-feb1-471f-b498-ea121d171926"). InnerVolumeSpecName "kube-api-access-vtmnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:30:03 crc kubenswrapper[4774]: I1003 15:30:03.350328 4774 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/459c496f-feb1-471f-b498-ea121d171926-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 15:30:03 crc kubenswrapper[4774]: I1003 15:30:03.350593 4774 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/459c496f-feb1-471f-b498-ea121d171926-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 15:30:03 crc kubenswrapper[4774]: I1003 15:30:03.350690 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtmnc\" (UniqueName: \"kubernetes.io/projected/459c496f-feb1-471f-b498-ea121d171926-kube-api-access-vtmnc\") on node \"crc\" DevicePath \"\"" Oct 03 15:30:03 crc kubenswrapper[4774]: I1003 15:30:03.852933 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325090-6mdvk" event={"ID":"459c496f-feb1-471f-b498-ea121d171926","Type":"ContainerDied","Data":"6d3e4c460e6fa3750c07a066b85742e12953a12e8e3163f3755f66c55f9845f2"} Oct 03 15:30:03 crc kubenswrapper[4774]: I1003 15:30:03.852984 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d3e4c460e6fa3750c07a066b85742e12953a12e8e3163f3755f66c55f9845f2" Oct 03 15:30:03 crc kubenswrapper[4774]: I1003 15:30:03.853326 4774 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325090-6mdvk" Oct 03 15:30:03 crc kubenswrapper[4774]: I1003 15:30:03.933908 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-447jw" Oct 03 15:30:04 crc kubenswrapper[4774]: I1003 15:30:04.003347 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-447jw" Oct 03 15:30:04 crc kubenswrapper[4774]: I1003 15:30:04.179208 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-447jw"] Oct 03 15:30:04 crc kubenswrapper[4774]: I1003 15:30:04.359748 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325045-8kfbc"] Oct 03 15:30:04 crc kubenswrapper[4774]: I1003 15:30:04.370799 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325045-8kfbc"] Oct 03 15:30:05 crc kubenswrapper[4774]: I1003 15:30:05.315188 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9" path="/var/lib/kubelet/pods/951d01b4-ac65-4b14-afa6-6e2bcb7fbdb9/volumes" Oct 03 15:30:05 crc kubenswrapper[4774]: I1003 15:30:05.868616 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-447jw" podUID="004d610c-a775-42bc-9a46-fa30fb50c03c" containerName="registry-server" containerID="cri-o://4947aa112823965eac9597821c22b4d16233863f93b2bc75e6c62f6298f86ea1" gracePeriod=2 Oct 03 15:30:06 crc kubenswrapper[4774]: I1003 15:30:06.311283 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-447jw" Oct 03 15:30:06 crc kubenswrapper[4774]: I1003 15:30:06.509525 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/004d610c-a775-42bc-9a46-fa30fb50c03c-utilities\") pod \"004d610c-a775-42bc-9a46-fa30fb50c03c\" (UID: \"004d610c-a775-42bc-9a46-fa30fb50c03c\") " Oct 03 15:30:06 crc kubenswrapper[4774]: I1003 15:30:06.509711 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/004d610c-a775-42bc-9a46-fa30fb50c03c-catalog-content\") pod \"004d610c-a775-42bc-9a46-fa30fb50c03c\" (UID: \"004d610c-a775-42bc-9a46-fa30fb50c03c\") " Oct 03 15:30:06 crc kubenswrapper[4774]: I1003 15:30:06.509773 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76f5h\" (UniqueName: \"kubernetes.io/projected/004d610c-a775-42bc-9a46-fa30fb50c03c-kube-api-access-76f5h\") pod \"004d610c-a775-42bc-9a46-fa30fb50c03c\" (UID: \"004d610c-a775-42bc-9a46-fa30fb50c03c\") " Oct 03 15:30:06 crc kubenswrapper[4774]: I1003 15:30:06.510589 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/004d610c-a775-42bc-9a46-fa30fb50c03c-utilities" (OuterVolumeSpecName: "utilities") pod "004d610c-a775-42bc-9a46-fa30fb50c03c" (UID: "004d610c-a775-42bc-9a46-fa30fb50c03c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:30:06 crc kubenswrapper[4774]: I1003 15:30:06.515801 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/004d610c-a775-42bc-9a46-fa30fb50c03c-kube-api-access-76f5h" (OuterVolumeSpecName: "kube-api-access-76f5h") pod "004d610c-a775-42bc-9a46-fa30fb50c03c" (UID: "004d610c-a775-42bc-9a46-fa30fb50c03c"). InnerVolumeSpecName "kube-api-access-76f5h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:30:06 crc kubenswrapper[4774]: I1003 15:30:06.556390 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/004d610c-a775-42bc-9a46-fa30fb50c03c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "004d610c-a775-42bc-9a46-fa30fb50c03c" (UID: "004d610c-a775-42bc-9a46-fa30fb50c03c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:30:06 crc kubenswrapper[4774]: I1003 15:30:06.611993 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/004d610c-a775-42bc-9a46-fa30fb50c03c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 15:30:06 crc kubenswrapper[4774]: I1003 15:30:06.612028 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76f5h\" (UniqueName: \"kubernetes.io/projected/004d610c-a775-42bc-9a46-fa30fb50c03c-kube-api-access-76f5h\") on node \"crc\" DevicePath \"\"" Oct 03 15:30:06 crc kubenswrapper[4774]: I1003 15:30:06.612046 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/004d610c-a775-42bc-9a46-fa30fb50c03c-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 15:30:06 crc kubenswrapper[4774]: I1003 15:30:06.879114 4774 generic.go:334] "Generic (PLEG): container finished" podID="004d610c-a775-42bc-9a46-fa30fb50c03c" containerID="4947aa112823965eac9597821c22b4d16233863f93b2bc75e6c62f6298f86ea1" exitCode=0 Oct 03 15:30:06 crc kubenswrapper[4774]: I1003 15:30:06.879155 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-447jw" event={"ID":"004d610c-a775-42bc-9a46-fa30fb50c03c","Type":"ContainerDied","Data":"4947aa112823965eac9597821c22b4d16233863f93b2bc75e6c62f6298f86ea1"} Oct 03 15:30:06 crc kubenswrapper[4774]: I1003 15:30:06.879182 4774 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-447jw" event={"ID":"004d610c-a775-42bc-9a46-fa30fb50c03c","Type":"ContainerDied","Data":"d03419a9f231ed008b50add281eba463414e29e45308aac32b2919049b8c56d3"} Oct 03 15:30:06 crc kubenswrapper[4774]: I1003 15:30:06.879198 4774 scope.go:117] "RemoveContainer" containerID="4947aa112823965eac9597821c22b4d16233863f93b2bc75e6c62f6298f86ea1" Oct 03 15:30:06 crc kubenswrapper[4774]: I1003 15:30:06.879208 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-447jw" Oct 03 15:30:06 crc kubenswrapper[4774]: I1003 15:30:06.914928 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-447jw"] Oct 03 15:30:06 crc kubenswrapper[4774]: I1003 15:30:06.917450 4774 scope.go:117] "RemoveContainer" containerID="c6f5c79d5d874381aba410ae91a91038c1ee154da651a7787c607fd98249e0df" Oct 03 15:30:06 crc kubenswrapper[4774]: I1003 15:30:06.923902 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-447jw"] Oct 03 15:30:06 crc kubenswrapper[4774]: I1003 15:30:06.949588 4774 scope.go:117] "RemoveContainer" containerID="cdedc8ea767656dcfbbc9999bff6e904b6f691db68545ca51bd256e5b80abc2d" Oct 03 15:30:06 crc kubenswrapper[4774]: I1003 15:30:06.996073 4774 scope.go:117] "RemoveContainer" containerID="4947aa112823965eac9597821c22b4d16233863f93b2bc75e6c62f6298f86ea1" Oct 03 15:30:06 crc kubenswrapper[4774]: E1003 15:30:06.996637 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4947aa112823965eac9597821c22b4d16233863f93b2bc75e6c62f6298f86ea1\": container with ID starting with 4947aa112823965eac9597821c22b4d16233863f93b2bc75e6c62f6298f86ea1 not found: ID does not exist" containerID="4947aa112823965eac9597821c22b4d16233863f93b2bc75e6c62f6298f86ea1" Oct 03 15:30:06 crc kubenswrapper[4774]: I1003 
15:30:06.996680 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4947aa112823965eac9597821c22b4d16233863f93b2bc75e6c62f6298f86ea1"} err="failed to get container status \"4947aa112823965eac9597821c22b4d16233863f93b2bc75e6c62f6298f86ea1\": rpc error: code = NotFound desc = could not find container \"4947aa112823965eac9597821c22b4d16233863f93b2bc75e6c62f6298f86ea1\": container with ID starting with 4947aa112823965eac9597821c22b4d16233863f93b2bc75e6c62f6298f86ea1 not found: ID does not exist" Oct 03 15:30:06 crc kubenswrapper[4774]: I1003 15:30:06.996707 4774 scope.go:117] "RemoveContainer" containerID="c6f5c79d5d874381aba410ae91a91038c1ee154da651a7787c607fd98249e0df" Oct 03 15:30:06 crc kubenswrapper[4774]: E1003 15:30:06.997087 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6f5c79d5d874381aba410ae91a91038c1ee154da651a7787c607fd98249e0df\": container with ID starting with c6f5c79d5d874381aba410ae91a91038c1ee154da651a7787c607fd98249e0df not found: ID does not exist" containerID="c6f5c79d5d874381aba410ae91a91038c1ee154da651a7787c607fd98249e0df" Oct 03 15:30:06 crc kubenswrapper[4774]: I1003 15:30:06.997122 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6f5c79d5d874381aba410ae91a91038c1ee154da651a7787c607fd98249e0df"} err="failed to get container status \"c6f5c79d5d874381aba410ae91a91038c1ee154da651a7787c607fd98249e0df\": rpc error: code = NotFound desc = could not find container \"c6f5c79d5d874381aba410ae91a91038c1ee154da651a7787c607fd98249e0df\": container with ID starting with c6f5c79d5d874381aba410ae91a91038c1ee154da651a7787c607fd98249e0df not found: ID does not exist" Oct 03 15:30:06 crc kubenswrapper[4774]: I1003 15:30:06.997147 4774 scope.go:117] "RemoveContainer" containerID="cdedc8ea767656dcfbbc9999bff6e904b6f691db68545ca51bd256e5b80abc2d" Oct 03 15:30:06 crc 
kubenswrapper[4774]: E1003 15:30:06.997642 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdedc8ea767656dcfbbc9999bff6e904b6f691db68545ca51bd256e5b80abc2d\": container with ID starting with cdedc8ea767656dcfbbc9999bff6e904b6f691db68545ca51bd256e5b80abc2d not found: ID does not exist" containerID="cdedc8ea767656dcfbbc9999bff6e904b6f691db68545ca51bd256e5b80abc2d" Oct 03 15:30:06 crc kubenswrapper[4774]: I1003 15:30:06.997738 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdedc8ea767656dcfbbc9999bff6e904b6f691db68545ca51bd256e5b80abc2d"} err="failed to get container status \"cdedc8ea767656dcfbbc9999bff6e904b6f691db68545ca51bd256e5b80abc2d\": rpc error: code = NotFound desc = could not find container \"cdedc8ea767656dcfbbc9999bff6e904b6f691db68545ca51bd256e5b80abc2d\": container with ID starting with cdedc8ea767656dcfbbc9999bff6e904b6f691db68545ca51bd256e5b80abc2d not found: ID does not exist" Oct 03 15:30:07 crc kubenswrapper[4774]: I1003 15:30:07.310983 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="004d610c-a775-42bc-9a46-fa30fb50c03c" path="/var/lib/kubelet/pods/004d610c-a775-42bc-9a46-fa30fb50c03c/volumes" Oct 03 15:30:26 crc kubenswrapper[4774]: I1003 15:30:26.818003 4774 scope.go:117] "RemoveContainer" containerID="1a02b436e27ea94f0d0c69bf07983b638385eb7dc87c0284146192a5e0740503" Oct 03 15:30:31 crc kubenswrapper[4774]: I1003 15:30:31.329283 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-scgxw"] Oct 03 15:30:31 crc kubenswrapper[4774]: E1003 15:30:31.330255 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="459c496f-feb1-471f-b498-ea121d171926" containerName="collect-profiles" Oct 03 15:30:31 crc kubenswrapper[4774]: I1003 15:30:31.330268 4774 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="459c496f-feb1-471f-b498-ea121d171926" containerName="collect-profiles" Oct 03 15:30:31 crc kubenswrapper[4774]: E1003 15:30:31.330278 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="004d610c-a775-42bc-9a46-fa30fb50c03c" containerName="registry-server" Oct 03 15:30:31 crc kubenswrapper[4774]: I1003 15:30:31.330284 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="004d610c-a775-42bc-9a46-fa30fb50c03c" containerName="registry-server" Oct 03 15:30:31 crc kubenswrapper[4774]: E1003 15:30:31.330306 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="004d610c-a775-42bc-9a46-fa30fb50c03c" containerName="extract-content" Oct 03 15:30:31 crc kubenswrapper[4774]: I1003 15:30:31.330311 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="004d610c-a775-42bc-9a46-fa30fb50c03c" containerName="extract-content" Oct 03 15:30:31 crc kubenswrapper[4774]: E1003 15:30:31.330322 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="004d610c-a775-42bc-9a46-fa30fb50c03c" containerName="extract-utilities" Oct 03 15:30:31 crc kubenswrapper[4774]: I1003 15:30:31.330328 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="004d610c-a775-42bc-9a46-fa30fb50c03c" containerName="extract-utilities" Oct 03 15:30:31 crc kubenswrapper[4774]: I1003 15:30:31.330524 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="459c496f-feb1-471f-b498-ea121d171926" containerName="collect-profiles" Oct 03 15:30:31 crc kubenswrapper[4774]: I1003 15:30:31.330546 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="004d610c-a775-42bc-9a46-fa30fb50c03c" containerName="registry-server" Oct 03 15:30:31 crc kubenswrapper[4774]: I1003 15:30:31.331749 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-scgxw"] Oct 03 15:30:31 crc kubenswrapper[4774]: I1003 15:30:31.331830 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-scgxw" Oct 03 15:30:31 crc kubenswrapper[4774]: I1003 15:30:31.504605 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9938b350-5406-4a55-9da7-5b084da44eb0-utilities\") pod \"redhat-marketplace-scgxw\" (UID: \"9938b350-5406-4a55-9da7-5b084da44eb0\") " pod="openshift-marketplace/redhat-marketplace-scgxw" Oct 03 15:30:31 crc kubenswrapper[4774]: I1003 15:30:31.504679 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9938b350-5406-4a55-9da7-5b084da44eb0-catalog-content\") pod \"redhat-marketplace-scgxw\" (UID: \"9938b350-5406-4a55-9da7-5b084da44eb0\") " pod="openshift-marketplace/redhat-marketplace-scgxw" Oct 03 15:30:31 crc kubenswrapper[4774]: I1003 15:30:31.504989 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwtdp\" (UniqueName: \"kubernetes.io/projected/9938b350-5406-4a55-9da7-5b084da44eb0-kube-api-access-pwtdp\") pod \"redhat-marketplace-scgxw\" (UID: \"9938b350-5406-4a55-9da7-5b084da44eb0\") " pod="openshift-marketplace/redhat-marketplace-scgxw" Oct 03 15:30:31 crc kubenswrapper[4774]: I1003 15:30:31.606232 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwtdp\" (UniqueName: \"kubernetes.io/projected/9938b350-5406-4a55-9da7-5b084da44eb0-kube-api-access-pwtdp\") pod \"redhat-marketplace-scgxw\" (UID: \"9938b350-5406-4a55-9da7-5b084da44eb0\") " pod="openshift-marketplace/redhat-marketplace-scgxw" Oct 03 15:30:31 crc kubenswrapper[4774]: I1003 15:30:31.606308 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9938b350-5406-4a55-9da7-5b084da44eb0-utilities\") pod 
\"redhat-marketplace-scgxw\" (UID: \"9938b350-5406-4a55-9da7-5b084da44eb0\") " pod="openshift-marketplace/redhat-marketplace-scgxw" Oct 03 15:30:31 crc kubenswrapper[4774]: I1003 15:30:31.606355 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9938b350-5406-4a55-9da7-5b084da44eb0-catalog-content\") pod \"redhat-marketplace-scgxw\" (UID: \"9938b350-5406-4a55-9da7-5b084da44eb0\") " pod="openshift-marketplace/redhat-marketplace-scgxw" Oct 03 15:30:31 crc kubenswrapper[4774]: I1003 15:30:31.606948 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9938b350-5406-4a55-9da7-5b084da44eb0-catalog-content\") pod \"redhat-marketplace-scgxw\" (UID: \"9938b350-5406-4a55-9da7-5b084da44eb0\") " pod="openshift-marketplace/redhat-marketplace-scgxw" Oct 03 15:30:31 crc kubenswrapper[4774]: I1003 15:30:31.607617 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9938b350-5406-4a55-9da7-5b084da44eb0-utilities\") pod \"redhat-marketplace-scgxw\" (UID: \"9938b350-5406-4a55-9da7-5b084da44eb0\") " pod="openshift-marketplace/redhat-marketplace-scgxw" Oct 03 15:30:31 crc kubenswrapper[4774]: I1003 15:30:31.628354 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwtdp\" (UniqueName: \"kubernetes.io/projected/9938b350-5406-4a55-9da7-5b084da44eb0-kube-api-access-pwtdp\") pod \"redhat-marketplace-scgxw\" (UID: \"9938b350-5406-4a55-9da7-5b084da44eb0\") " pod="openshift-marketplace/redhat-marketplace-scgxw" Oct 03 15:30:31 crc kubenswrapper[4774]: I1003 15:30:31.664172 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-scgxw" Oct 03 15:30:32 crc kubenswrapper[4774]: I1003 15:30:32.127007 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-scgxw"] Oct 03 15:30:32 crc kubenswrapper[4774]: I1003 15:30:32.174485 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-scgxw" event={"ID":"9938b350-5406-4a55-9da7-5b084da44eb0","Type":"ContainerStarted","Data":"481c63b7a8f185f50b812bf6cfb4a94a53ac506ebdd0522a7838429a9141747e"} Oct 03 15:30:33 crc kubenswrapper[4774]: I1003 15:30:33.190525 4774 generic.go:334] "Generic (PLEG): container finished" podID="9938b350-5406-4a55-9da7-5b084da44eb0" containerID="b9c42ca712ff25434a1165809d32962b20226f56a6f960a8b14cde9886fe9d8b" exitCode=0 Oct 03 15:30:33 crc kubenswrapper[4774]: I1003 15:30:33.190829 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-scgxw" event={"ID":"9938b350-5406-4a55-9da7-5b084da44eb0","Type":"ContainerDied","Data":"b9c42ca712ff25434a1165809d32962b20226f56a6f960a8b14cde9886fe9d8b"} Oct 03 15:30:35 crc kubenswrapper[4774]: I1003 15:30:35.212971 4774 generic.go:334] "Generic (PLEG): container finished" podID="9938b350-5406-4a55-9da7-5b084da44eb0" containerID="b879468479ae962372779a668f772570465887c9f576a87b291edec7a6be0f08" exitCode=0 Oct 03 15:30:35 crc kubenswrapper[4774]: I1003 15:30:35.213248 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-scgxw" event={"ID":"9938b350-5406-4a55-9da7-5b084da44eb0","Type":"ContainerDied","Data":"b879468479ae962372779a668f772570465887c9f576a87b291edec7a6be0f08"} Oct 03 15:30:36 crc kubenswrapper[4774]: I1003 15:30:36.224243 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-scgxw" 
event={"ID":"9938b350-5406-4a55-9da7-5b084da44eb0","Type":"ContainerStarted","Data":"4a29dfc4b7da2deda5d024ee44e04820dad2d4034019a688b5bb8e079cc5e8d0"} Oct 03 15:30:36 crc kubenswrapper[4774]: I1003 15:30:36.250951 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-scgxw" podStartSLOduration=2.725772456 podStartE2EDuration="5.250925432s" podCreationTimestamp="2025-10-03 15:30:31 +0000 UTC" firstStartedPulling="2025-10-03 15:30:33.192725907 +0000 UTC m=+2855.781929359" lastFinishedPulling="2025-10-03 15:30:35.717878883 +0000 UTC m=+2858.307082335" observedRunningTime="2025-10-03 15:30:36.241893209 +0000 UTC m=+2858.831096671" watchObservedRunningTime="2025-10-03 15:30:36.250925432 +0000 UTC m=+2858.840128884" Oct 03 15:30:41 crc kubenswrapper[4774]: I1003 15:30:41.665341 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-scgxw" Oct 03 15:30:41 crc kubenswrapper[4774]: I1003 15:30:41.665801 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-scgxw" Oct 03 15:30:41 crc kubenswrapper[4774]: I1003 15:30:41.721568 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-scgxw" Oct 03 15:30:42 crc kubenswrapper[4774]: I1003 15:30:42.347459 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-scgxw" Oct 03 15:30:42 crc kubenswrapper[4774]: I1003 15:30:42.399759 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-scgxw"] Oct 03 15:30:44 crc kubenswrapper[4774]: I1003 15:30:44.316570 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-scgxw" podUID="9938b350-5406-4a55-9da7-5b084da44eb0" containerName="registry-server" 
containerID="cri-o://4a29dfc4b7da2deda5d024ee44e04820dad2d4034019a688b5bb8e079cc5e8d0" gracePeriod=2 Oct 03 15:30:44 crc kubenswrapper[4774]: I1003 15:30:44.773326 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-scgxw" Oct 03 15:30:44 crc kubenswrapper[4774]: I1003 15:30:44.863565 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9938b350-5406-4a55-9da7-5b084da44eb0-catalog-content\") pod \"9938b350-5406-4a55-9da7-5b084da44eb0\" (UID: \"9938b350-5406-4a55-9da7-5b084da44eb0\") " Oct 03 15:30:44 crc kubenswrapper[4774]: I1003 15:30:44.863846 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9938b350-5406-4a55-9da7-5b084da44eb0-utilities\") pod \"9938b350-5406-4a55-9da7-5b084da44eb0\" (UID: \"9938b350-5406-4a55-9da7-5b084da44eb0\") " Oct 03 15:30:44 crc kubenswrapper[4774]: I1003 15:30:44.863967 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwtdp\" (UniqueName: \"kubernetes.io/projected/9938b350-5406-4a55-9da7-5b084da44eb0-kube-api-access-pwtdp\") pod \"9938b350-5406-4a55-9da7-5b084da44eb0\" (UID: \"9938b350-5406-4a55-9da7-5b084da44eb0\") " Oct 03 15:30:44 crc kubenswrapper[4774]: I1003 15:30:44.864597 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9938b350-5406-4a55-9da7-5b084da44eb0-utilities" (OuterVolumeSpecName: "utilities") pod "9938b350-5406-4a55-9da7-5b084da44eb0" (UID: "9938b350-5406-4a55-9da7-5b084da44eb0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:30:44 crc kubenswrapper[4774]: I1003 15:30:44.873599 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9938b350-5406-4a55-9da7-5b084da44eb0-kube-api-access-pwtdp" (OuterVolumeSpecName: "kube-api-access-pwtdp") pod "9938b350-5406-4a55-9da7-5b084da44eb0" (UID: "9938b350-5406-4a55-9da7-5b084da44eb0"). InnerVolumeSpecName "kube-api-access-pwtdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:30:44 crc kubenswrapper[4774]: I1003 15:30:44.884327 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9938b350-5406-4a55-9da7-5b084da44eb0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9938b350-5406-4a55-9da7-5b084da44eb0" (UID: "9938b350-5406-4a55-9da7-5b084da44eb0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:30:44 crc kubenswrapper[4774]: I1003 15:30:44.966569 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9938b350-5406-4a55-9da7-5b084da44eb0-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 15:30:44 crc kubenswrapper[4774]: I1003 15:30:44.966601 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwtdp\" (UniqueName: \"kubernetes.io/projected/9938b350-5406-4a55-9da7-5b084da44eb0-kube-api-access-pwtdp\") on node \"crc\" DevicePath \"\"" Oct 03 15:30:44 crc kubenswrapper[4774]: I1003 15:30:44.966613 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9938b350-5406-4a55-9da7-5b084da44eb0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 15:30:45 crc kubenswrapper[4774]: I1003 15:30:45.328300 4774 generic.go:334] "Generic (PLEG): container finished" podID="9938b350-5406-4a55-9da7-5b084da44eb0" 
containerID="4a29dfc4b7da2deda5d024ee44e04820dad2d4034019a688b5bb8e079cc5e8d0" exitCode=0 Oct 03 15:30:45 crc kubenswrapper[4774]: I1003 15:30:45.328346 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-scgxw" Oct 03 15:30:45 crc kubenswrapper[4774]: I1003 15:30:45.328347 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-scgxw" event={"ID":"9938b350-5406-4a55-9da7-5b084da44eb0","Type":"ContainerDied","Data":"4a29dfc4b7da2deda5d024ee44e04820dad2d4034019a688b5bb8e079cc5e8d0"} Oct 03 15:30:45 crc kubenswrapper[4774]: I1003 15:30:45.328602 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-scgxw" event={"ID":"9938b350-5406-4a55-9da7-5b084da44eb0","Type":"ContainerDied","Data":"481c63b7a8f185f50b812bf6cfb4a94a53ac506ebdd0522a7838429a9141747e"} Oct 03 15:30:45 crc kubenswrapper[4774]: I1003 15:30:45.328648 4774 scope.go:117] "RemoveContainer" containerID="4a29dfc4b7da2deda5d024ee44e04820dad2d4034019a688b5bb8e079cc5e8d0" Oct 03 15:30:45 crc kubenswrapper[4774]: I1003 15:30:45.361330 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-scgxw"] Oct 03 15:30:45 crc kubenswrapper[4774]: I1003 15:30:45.367070 4774 scope.go:117] "RemoveContainer" containerID="b879468479ae962372779a668f772570465887c9f576a87b291edec7a6be0f08" Oct 03 15:30:45 crc kubenswrapper[4774]: I1003 15:30:45.374058 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-scgxw"] Oct 03 15:30:45 crc kubenswrapper[4774]: I1003 15:30:45.392619 4774 scope.go:117] "RemoveContainer" containerID="b9c42ca712ff25434a1165809d32962b20226f56a6f960a8b14cde9886fe9d8b" Oct 03 15:30:45 crc kubenswrapper[4774]: I1003 15:30:45.454029 4774 scope.go:117] "RemoveContainer" containerID="4a29dfc4b7da2deda5d024ee44e04820dad2d4034019a688b5bb8e079cc5e8d0" Oct 03 
15:30:45 crc kubenswrapper[4774]: E1003 15:30:45.454600 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a29dfc4b7da2deda5d024ee44e04820dad2d4034019a688b5bb8e079cc5e8d0\": container with ID starting with 4a29dfc4b7da2deda5d024ee44e04820dad2d4034019a688b5bb8e079cc5e8d0 not found: ID does not exist" containerID="4a29dfc4b7da2deda5d024ee44e04820dad2d4034019a688b5bb8e079cc5e8d0" Oct 03 15:30:45 crc kubenswrapper[4774]: I1003 15:30:45.454649 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a29dfc4b7da2deda5d024ee44e04820dad2d4034019a688b5bb8e079cc5e8d0"} err="failed to get container status \"4a29dfc4b7da2deda5d024ee44e04820dad2d4034019a688b5bb8e079cc5e8d0\": rpc error: code = NotFound desc = could not find container \"4a29dfc4b7da2deda5d024ee44e04820dad2d4034019a688b5bb8e079cc5e8d0\": container with ID starting with 4a29dfc4b7da2deda5d024ee44e04820dad2d4034019a688b5bb8e079cc5e8d0 not found: ID does not exist" Oct 03 15:30:45 crc kubenswrapper[4774]: I1003 15:30:45.454681 4774 scope.go:117] "RemoveContainer" containerID="b879468479ae962372779a668f772570465887c9f576a87b291edec7a6be0f08" Oct 03 15:30:45 crc kubenswrapper[4774]: E1003 15:30:45.454944 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b879468479ae962372779a668f772570465887c9f576a87b291edec7a6be0f08\": container with ID starting with b879468479ae962372779a668f772570465887c9f576a87b291edec7a6be0f08 not found: ID does not exist" containerID="b879468479ae962372779a668f772570465887c9f576a87b291edec7a6be0f08" Oct 03 15:30:45 crc kubenswrapper[4774]: I1003 15:30:45.454975 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b879468479ae962372779a668f772570465887c9f576a87b291edec7a6be0f08"} err="failed to get container status 
\"b879468479ae962372779a668f772570465887c9f576a87b291edec7a6be0f08\": rpc error: code = NotFound desc = could not find container \"b879468479ae962372779a668f772570465887c9f576a87b291edec7a6be0f08\": container with ID starting with b879468479ae962372779a668f772570465887c9f576a87b291edec7a6be0f08 not found: ID does not exist" Oct 03 15:30:45 crc kubenswrapper[4774]: I1003 15:30:45.454990 4774 scope.go:117] "RemoveContainer" containerID="b9c42ca712ff25434a1165809d32962b20226f56a6f960a8b14cde9886fe9d8b" Oct 03 15:30:45 crc kubenswrapper[4774]: E1003 15:30:45.455188 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9c42ca712ff25434a1165809d32962b20226f56a6f960a8b14cde9886fe9d8b\": container with ID starting with b9c42ca712ff25434a1165809d32962b20226f56a6f960a8b14cde9886fe9d8b not found: ID does not exist" containerID="b9c42ca712ff25434a1165809d32962b20226f56a6f960a8b14cde9886fe9d8b" Oct 03 15:30:45 crc kubenswrapper[4774]: I1003 15:30:45.455209 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9c42ca712ff25434a1165809d32962b20226f56a6f960a8b14cde9886fe9d8b"} err="failed to get container status \"b9c42ca712ff25434a1165809d32962b20226f56a6f960a8b14cde9886fe9d8b\": rpc error: code = NotFound desc = could not find container \"b9c42ca712ff25434a1165809d32962b20226f56a6f960a8b14cde9886fe9d8b\": container with ID starting with b9c42ca712ff25434a1165809d32962b20226f56a6f960a8b14cde9886fe9d8b not found: ID does not exist" Oct 03 15:30:47 crc kubenswrapper[4774]: I1003 15:30:47.313015 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9938b350-5406-4a55-9da7-5b084da44eb0" path="/var/lib/kubelet/pods/9938b350-5406-4a55-9da7-5b084da44eb0/volumes" Oct 03 15:32:20 crc kubenswrapper[4774]: I1003 15:32:20.654134 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:32:20 crc kubenswrapper[4774]: I1003 15:32:20.654981 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:32:50 crc kubenswrapper[4774]: I1003 15:32:50.653502 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:32:50 crc kubenswrapper[4774]: I1003 15:32:50.654178 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:33:20 crc kubenswrapper[4774]: I1003 15:33:20.654035 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:33:20 crc kubenswrapper[4774]: I1003 15:33:20.654855 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:33:20 crc kubenswrapper[4774]: I1003 15:33:20.654929 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" Oct 03 15:33:20 crc kubenswrapper[4774]: I1003 15:33:20.656181 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cfea26b55e1d9b962f87ac922eb7772356d541492b505a24f7ff4d97f8d38ef7"} pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 15:33:20 crc kubenswrapper[4774]: I1003 15:33:20.656292 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" containerID="cri-o://cfea26b55e1d9b962f87ac922eb7772356d541492b505a24f7ff4d97f8d38ef7" gracePeriod=600 Oct 03 15:33:20 crc kubenswrapper[4774]: E1003 15:33:20.781329 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:33:21 crc kubenswrapper[4774]: I1003 15:33:21.004827 4774 generic.go:334] "Generic (PLEG): container finished" podID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerID="cfea26b55e1d9b962f87ac922eb7772356d541492b505a24f7ff4d97f8d38ef7" exitCode=0 Oct 03 15:33:21 crc kubenswrapper[4774]: I1003 15:33:21.005001 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" event={"ID":"ca37ac4b-f421-4198-a179-12901d36f0f5","Type":"ContainerDied","Data":"cfea26b55e1d9b962f87ac922eb7772356d541492b505a24f7ff4d97f8d38ef7"} Oct 03 15:33:21 crc kubenswrapper[4774]: I1003 15:33:21.005484 4774 scope.go:117] "RemoveContainer" containerID="feff5047aec719d7bb7df138458f8ef5d1a9a2a9b949c0e51ed81eb70e458c8a" Oct 03 15:33:21 crc kubenswrapper[4774]: I1003 15:33:21.008134 4774 scope.go:117] "RemoveContainer" containerID="cfea26b55e1d9b962f87ac922eb7772356d541492b505a24f7ff4d97f8d38ef7" Oct 03 15:33:21 crc kubenswrapper[4774]: E1003 15:33:21.008859 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:33:34 crc kubenswrapper[4774]: I1003 15:33:34.299230 4774 scope.go:117] "RemoveContainer" containerID="cfea26b55e1d9b962f87ac922eb7772356d541492b505a24f7ff4d97f8d38ef7" Oct 03 15:33:34 crc kubenswrapper[4774]: E1003 15:33:34.300141 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:33:45 crc kubenswrapper[4774]: I1003 15:33:45.300902 4774 scope.go:117] "RemoveContainer" containerID="cfea26b55e1d9b962f87ac922eb7772356d541492b505a24f7ff4d97f8d38ef7" Oct 03 15:33:45 crc kubenswrapper[4774]: E1003 15:33:45.302060 4774 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:33:59 crc kubenswrapper[4774]: I1003 15:33:59.319761 4774 scope.go:117] "RemoveContainer" containerID="cfea26b55e1d9b962f87ac922eb7772356d541492b505a24f7ff4d97f8d38ef7" Oct 03 15:33:59 crc kubenswrapper[4774]: E1003 15:33:59.323964 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:34:12 crc kubenswrapper[4774]: I1003 15:34:12.301218 4774 scope.go:117] "RemoveContainer" containerID="cfea26b55e1d9b962f87ac922eb7772356d541492b505a24f7ff4d97f8d38ef7" Oct 03 15:34:12 crc kubenswrapper[4774]: E1003 15:34:12.302231 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:34:25 crc kubenswrapper[4774]: I1003 15:34:25.299958 4774 scope.go:117] "RemoveContainer" containerID="cfea26b55e1d9b962f87ac922eb7772356d541492b505a24f7ff4d97f8d38ef7" Oct 03 15:34:25 crc kubenswrapper[4774]: E1003 
15:34:25.300627 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:34:38 crc kubenswrapper[4774]: I1003 15:34:38.301348 4774 scope.go:117] "RemoveContainer" containerID="cfea26b55e1d9b962f87ac922eb7772356d541492b505a24f7ff4d97f8d38ef7" Oct 03 15:34:38 crc kubenswrapper[4774]: E1003 15:34:38.302203 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:34:50 crc kubenswrapper[4774]: I1003 15:34:50.299244 4774 scope.go:117] "RemoveContainer" containerID="cfea26b55e1d9b962f87ac922eb7772356d541492b505a24f7ff4d97f8d38ef7" Oct 03 15:34:50 crc kubenswrapper[4774]: E1003 15:34:50.299929 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:35:01 crc kubenswrapper[4774]: I1003 15:35:01.298955 4774 scope.go:117] "RemoveContainer" containerID="cfea26b55e1d9b962f87ac922eb7772356d541492b505a24f7ff4d97f8d38ef7" Oct 03 15:35:01 crc 
kubenswrapper[4774]: E1003 15:35:01.299720 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:35:14 crc kubenswrapper[4774]: I1003 15:35:14.299166 4774 scope.go:117] "RemoveContainer" containerID="cfea26b55e1d9b962f87ac922eb7772356d541492b505a24f7ff4d97f8d38ef7" Oct 03 15:35:14 crc kubenswrapper[4774]: E1003 15:35:14.299977 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:35:25 crc kubenswrapper[4774]: I1003 15:35:25.300040 4774 scope.go:117] "RemoveContainer" containerID="cfea26b55e1d9b962f87ac922eb7772356d541492b505a24f7ff4d97f8d38ef7" Oct 03 15:35:25 crc kubenswrapper[4774]: E1003 15:35:25.301439 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:35:38 crc kubenswrapper[4774]: I1003 15:35:38.300023 4774 scope.go:117] "RemoveContainer" containerID="cfea26b55e1d9b962f87ac922eb7772356d541492b505a24f7ff4d97f8d38ef7" Oct 
03 15:35:38 crc kubenswrapper[4774]: E1003 15:35:38.300859 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:35:50 crc kubenswrapper[4774]: I1003 15:35:50.299793 4774 scope.go:117] "RemoveContainer" containerID="cfea26b55e1d9b962f87ac922eb7772356d541492b505a24f7ff4d97f8d38ef7" Oct 03 15:35:50 crc kubenswrapper[4774]: E1003 15:35:50.300981 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:36:01 crc kubenswrapper[4774]: I1003 15:36:01.299011 4774 scope.go:117] "RemoveContainer" containerID="cfea26b55e1d9b962f87ac922eb7772356d541492b505a24f7ff4d97f8d38ef7" Oct 03 15:36:01 crc kubenswrapper[4774]: E1003 15:36:01.299816 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:36:14 crc kubenswrapper[4774]: I1003 15:36:14.300884 4774 scope.go:117] "RemoveContainer" 
containerID="cfea26b55e1d9b962f87ac922eb7772356d541492b505a24f7ff4d97f8d38ef7" Oct 03 15:36:14 crc kubenswrapper[4774]: E1003 15:36:14.302058 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:36:26 crc kubenswrapper[4774]: I1003 15:36:26.300246 4774 scope.go:117] "RemoveContainer" containerID="cfea26b55e1d9b962f87ac922eb7772356d541492b505a24f7ff4d97f8d38ef7" Oct 03 15:36:26 crc kubenswrapper[4774]: E1003 15:36:26.301562 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:36:37 crc kubenswrapper[4774]: I1003 15:36:37.299718 4774 scope.go:117] "RemoveContainer" containerID="cfea26b55e1d9b962f87ac922eb7772356d541492b505a24f7ff4d97f8d38ef7" Oct 03 15:36:37 crc kubenswrapper[4774]: E1003 15:36:37.301206 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:36:51 crc kubenswrapper[4774]: I1003 15:36:51.300006 4774 scope.go:117] 
"RemoveContainer" containerID="cfea26b55e1d9b962f87ac922eb7772356d541492b505a24f7ff4d97f8d38ef7" Oct 03 15:36:51 crc kubenswrapper[4774]: E1003 15:36:51.301148 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:37:02 crc kubenswrapper[4774]: I1003 15:37:02.300161 4774 scope.go:117] "RemoveContainer" containerID="cfea26b55e1d9b962f87ac922eb7772356d541492b505a24f7ff4d97f8d38ef7" Oct 03 15:37:02 crc kubenswrapper[4774]: E1003 15:37:02.302531 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:37:16 crc kubenswrapper[4774]: I1003 15:37:16.641711 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tj6db"] Oct 03 15:37:16 crc kubenswrapper[4774]: E1003 15:37:16.642891 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9938b350-5406-4a55-9da7-5b084da44eb0" containerName="extract-content" Oct 03 15:37:16 crc kubenswrapper[4774]: I1003 15:37:16.642907 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="9938b350-5406-4a55-9da7-5b084da44eb0" containerName="extract-content" Oct 03 15:37:16 crc kubenswrapper[4774]: E1003 15:37:16.642923 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9938b350-5406-4a55-9da7-5b084da44eb0" 
containerName="registry-server" Oct 03 15:37:16 crc kubenswrapper[4774]: I1003 15:37:16.642929 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="9938b350-5406-4a55-9da7-5b084da44eb0" containerName="registry-server" Oct 03 15:37:16 crc kubenswrapper[4774]: E1003 15:37:16.642963 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9938b350-5406-4a55-9da7-5b084da44eb0" containerName="extract-utilities" Oct 03 15:37:16 crc kubenswrapper[4774]: I1003 15:37:16.642970 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="9938b350-5406-4a55-9da7-5b084da44eb0" containerName="extract-utilities" Oct 03 15:37:16 crc kubenswrapper[4774]: I1003 15:37:16.643168 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="9938b350-5406-4a55-9da7-5b084da44eb0" containerName="registry-server" Oct 03 15:37:16 crc kubenswrapper[4774]: I1003 15:37:16.644577 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tj6db" Oct 03 15:37:16 crc kubenswrapper[4774]: I1003 15:37:16.652660 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tj6db"] Oct 03 15:37:16 crc kubenswrapper[4774]: I1003 15:37:16.741925 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76578014-bbd7-4bf5-9a95-6621514c4e59-catalog-content\") pod \"redhat-operators-tj6db\" (UID: \"76578014-bbd7-4bf5-9a95-6621514c4e59\") " pod="openshift-marketplace/redhat-operators-tj6db" Oct 03 15:37:16 crc kubenswrapper[4774]: I1003 15:37:16.742437 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76578014-bbd7-4bf5-9a95-6621514c4e59-utilities\") pod \"redhat-operators-tj6db\" (UID: \"76578014-bbd7-4bf5-9a95-6621514c4e59\") " pod="openshift-marketplace/redhat-operators-tj6db" Oct 
03 15:37:16 crc kubenswrapper[4774]: I1003 15:37:16.742899 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pct8h\" (UniqueName: \"kubernetes.io/projected/76578014-bbd7-4bf5-9a95-6621514c4e59-kube-api-access-pct8h\") pod \"redhat-operators-tj6db\" (UID: \"76578014-bbd7-4bf5-9a95-6621514c4e59\") " pod="openshift-marketplace/redhat-operators-tj6db" Oct 03 15:37:16 crc kubenswrapper[4774]: I1003 15:37:16.844497 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pct8h\" (UniqueName: \"kubernetes.io/projected/76578014-bbd7-4bf5-9a95-6621514c4e59-kube-api-access-pct8h\") pod \"redhat-operators-tj6db\" (UID: \"76578014-bbd7-4bf5-9a95-6621514c4e59\") " pod="openshift-marketplace/redhat-operators-tj6db" Oct 03 15:37:16 crc kubenswrapper[4774]: I1003 15:37:16.844565 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76578014-bbd7-4bf5-9a95-6621514c4e59-catalog-content\") pod \"redhat-operators-tj6db\" (UID: \"76578014-bbd7-4bf5-9a95-6621514c4e59\") " pod="openshift-marketplace/redhat-operators-tj6db" Oct 03 15:37:16 crc kubenswrapper[4774]: I1003 15:37:16.844606 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76578014-bbd7-4bf5-9a95-6621514c4e59-utilities\") pod \"redhat-operators-tj6db\" (UID: \"76578014-bbd7-4bf5-9a95-6621514c4e59\") " pod="openshift-marketplace/redhat-operators-tj6db" Oct 03 15:37:16 crc kubenswrapper[4774]: I1003 15:37:16.845101 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76578014-bbd7-4bf5-9a95-6621514c4e59-utilities\") pod \"redhat-operators-tj6db\" (UID: \"76578014-bbd7-4bf5-9a95-6621514c4e59\") " pod="openshift-marketplace/redhat-operators-tj6db" Oct 03 15:37:16 crc 
kubenswrapper[4774]: I1003 15:37:16.845173 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76578014-bbd7-4bf5-9a95-6621514c4e59-catalog-content\") pod \"redhat-operators-tj6db\" (UID: \"76578014-bbd7-4bf5-9a95-6621514c4e59\") " pod="openshift-marketplace/redhat-operators-tj6db" Oct 03 15:37:16 crc kubenswrapper[4774]: I1003 15:37:16.862790 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pct8h\" (UniqueName: \"kubernetes.io/projected/76578014-bbd7-4bf5-9a95-6621514c4e59-kube-api-access-pct8h\") pod \"redhat-operators-tj6db\" (UID: \"76578014-bbd7-4bf5-9a95-6621514c4e59\") " pod="openshift-marketplace/redhat-operators-tj6db" Oct 03 15:37:16 crc kubenswrapper[4774]: I1003 15:37:16.967656 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tj6db" Oct 03 15:37:17 crc kubenswrapper[4774]: I1003 15:37:17.299910 4774 scope.go:117] "RemoveContainer" containerID="cfea26b55e1d9b962f87ac922eb7772356d541492b505a24f7ff4d97f8d38ef7" Oct 03 15:37:17 crc kubenswrapper[4774]: E1003 15:37:17.300711 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:37:17 crc kubenswrapper[4774]: I1003 15:37:17.435006 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tj6db"] Oct 03 15:37:17 crc kubenswrapper[4774]: I1003 15:37:17.458434 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tj6db" 
event={"ID":"76578014-bbd7-4bf5-9a95-6621514c4e59","Type":"ContainerStarted","Data":"8d63d763c2d9e7cf207a0ff744d5e1f32f945f46835a180e651d1af9b90dbe9c"} Oct 03 15:37:18 crc kubenswrapper[4774]: I1003 15:37:18.472209 4774 generic.go:334] "Generic (PLEG): container finished" podID="76578014-bbd7-4bf5-9a95-6621514c4e59" containerID="f725016dd750c0b4aad32145d1c69c7fe7f99650ed84e37ca67dfa0c4dc4fe44" exitCode=0 Oct 03 15:37:18 crc kubenswrapper[4774]: I1003 15:37:18.472309 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tj6db" event={"ID":"76578014-bbd7-4bf5-9a95-6621514c4e59","Type":"ContainerDied","Data":"f725016dd750c0b4aad32145d1c69c7fe7f99650ed84e37ca67dfa0c4dc4fe44"} Oct 03 15:37:18 crc kubenswrapper[4774]: I1003 15:37:18.476620 4774 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 15:37:20 crc kubenswrapper[4774]: I1003 15:37:20.497277 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tj6db" event={"ID":"76578014-bbd7-4bf5-9a95-6621514c4e59","Type":"ContainerStarted","Data":"be68ae402465815194cd9247db73e3f24484ce3db0872a86537dd8b0f0b05e9f"} Oct 03 15:37:22 crc kubenswrapper[4774]: I1003 15:37:22.542938 4774 generic.go:334] "Generic (PLEG): container finished" podID="76578014-bbd7-4bf5-9a95-6621514c4e59" containerID="be68ae402465815194cd9247db73e3f24484ce3db0872a86537dd8b0f0b05e9f" exitCode=0 Oct 03 15:37:22 crc kubenswrapper[4774]: I1003 15:37:22.543039 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tj6db" event={"ID":"76578014-bbd7-4bf5-9a95-6621514c4e59","Type":"ContainerDied","Data":"be68ae402465815194cd9247db73e3f24484ce3db0872a86537dd8b0f0b05e9f"} Oct 03 15:37:23 crc kubenswrapper[4774]: I1003 15:37:23.555200 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tj6db" 
event={"ID":"76578014-bbd7-4bf5-9a95-6621514c4e59","Type":"ContainerStarted","Data":"cd0cb068ee1a819b213f7fbb65e33eafe493dcf9805c1683afb060aeb3879a8d"} Oct 03 15:37:23 crc kubenswrapper[4774]: I1003 15:37:23.579232 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tj6db" podStartSLOduration=2.949828896 podStartE2EDuration="7.579205461s" podCreationTimestamp="2025-10-03 15:37:16 +0000 UTC" firstStartedPulling="2025-10-03 15:37:18.476177108 +0000 UTC m=+3261.065380570" lastFinishedPulling="2025-10-03 15:37:23.105553643 +0000 UTC m=+3265.694757135" observedRunningTime="2025-10-03 15:37:23.571217162 +0000 UTC m=+3266.160420624" watchObservedRunningTime="2025-10-03 15:37:23.579205461 +0000 UTC m=+3266.168408943" Oct 03 15:37:26 crc kubenswrapper[4774]: I1003 15:37:26.968229 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tj6db" Oct 03 15:37:26 crc kubenswrapper[4774]: I1003 15:37:26.968754 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tj6db" Oct 03 15:37:28 crc kubenswrapper[4774]: I1003 15:37:28.019560 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tj6db" podUID="76578014-bbd7-4bf5-9a95-6621514c4e59" containerName="registry-server" probeResult="failure" output=< Oct 03 15:37:28 crc kubenswrapper[4774]: timeout: failed to connect service ":50051" within 1s Oct 03 15:37:28 crc kubenswrapper[4774]: > Oct 03 15:37:32 crc kubenswrapper[4774]: I1003 15:37:32.299763 4774 scope.go:117] "RemoveContainer" containerID="cfea26b55e1d9b962f87ac922eb7772356d541492b505a24f7ff4d97f8d38ef7" Oct 03 15:37:32 crc kubenswrapper[4774]: E1003 15:37:32.300398 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:37:37 crc kubenswrapper[4774]: I1003 15:37:37.064051 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tj6db" Oct 03 15:37:37 crc kubenswrapper[4774]: I1003 15:37:37.158567 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tj6db" Oct 03 15:37:37 crc kubenswrapper[4774]: I1003 15:37:37.319097 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tj6db"] Oct 03 15:37:38 crc kubenswrapper[4774]: I1003 15:37:38.707934 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tj6db" podUID="76578014-bbd7-4bf5-9a95-6621514c4e59" containerName="registry-server" containerID="cri-o://cd0cb068ee1a819b213f7fbb65e33eafe493dcf9805c1683afb060aeb3879a8d" gracePeriod=2 Oct 03 15:37:39 crc kubenswrapper[4774]: I1003 15:37:39.207871 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tj6db" Oct 03 15:37:39 crc kubenswrapper[4774]: I1003 15:37:39.309548 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76578014-bbd7-4bf5-9a95-6621514c4e59-catalog-content\") pod \"76578014-bbd7-4bf5-9a95-6621514c4e59\" (UID: \"76578014-bbd7-4bf5-9a95-6621514c4e59\") " Oct 03 15:37:39 crc kubenswrapper[4774]: I1003 15:37:39.309932 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pct8h\" (UniqueName: \"kubernetes.io/projected/76578014-bbd7-4bf5-9a95-6621514c4e59-kube-api-access-pct8h\") pod \"76578014-bbd7-4bf5-9a95-6621514c4e59\" (UID: \"76578014-bbd7-4bf5-9a95-6621514c4e59\") " Oct 03 15:37:39 crc kubenswrapper[4774]: I1003 15:37:39.310139 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76578014-bbd7-4bf5-9a95-6621514c4e59-utilities\") pod \"76578014-bbd7-4bf5-9a95-6621514c4e59\" (UID: \"76578014-bbd7-4bf5-9a95-6621514c4e59\") " Oct 03 15:37:39 crc kubenswrapper[4774]: I1003 15:37:39.311177 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76578014-bbd7-4bf5-9a95-6621514c4e59-utilities" (OuterVolumeSpecName: "utilities") pod "76578014-bbd7-4bf5-9a95-6621514c4e59" (UID: "76578014-bbd7-4bf5-9a95-6621514c4e59"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:37:39 crc kubenswrapper[4774]: I1003 15:37:39.317114 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76578014-bbd7-4bf5-9a95-6621514c4e59-kube-api-access-pct8h" (OuterVolumeSpecName: "kube-api-access-pct8h") pod "76578014-bbd7-4bf5-9a95-6621514c4e59" (UID: "76578014-bbd7-4bf5-9a95-6621514c4e59"). InnerVolumeSpecName "kube-api-access-pct8h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:37:39 crc kubenswrapper[4774]: I1003 15:37:39.397022 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76578014-bbd7-4bf5-9a95-6621514c4e59-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "76578014-bbd7-4bf5-9a95-6621514c4e59" (UID: "76578014-bbd7-4bf5-9a95-6621514c4e59"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:37:39 crc kubenswrapper[4774]: I1003 15:37:39.412314 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76578014-bbd7-4bf5-9a95-6621514c4e59-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 15:37:39 crc kubenswrapper[4774]: I1003 15:37:39.412347 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76578014-bbd7-4bf5-9a95-6621514c4e59-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 15:37:39 crc kubenswrapper[4774]: I1003 15:37:39.412357 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pct8h\" (UniqueName: \"kubernetes.io/projected/76578014-bbd7-4bf5-9a95-6621514c4e59-kube-api-access-pct8h\") on node \"crc\" DevicePath \"\"" Oct 03 15:37:39 crc kubenswrapper[4774]: I1003 15:37:39.725213 4774 generic.go:334] "Generic (PLEG): container finished" podID="76578014-bbd7-4bf5-9a95-6621514c4e59" containerID="cd0cb068ee1a819b213f7fbb65e33eafe493dcf9805c1683afb060aeb3879a8d" exitCode=0 Oct 03 15:37:39 crc kubenswrapper[4774]: I1003 15:37:39.725309 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tj6db" Oct 03 15:37:39 crc kubenswrapper[4774]: I1003 15:37:39.725572 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tj6db" event={"ID":"76578014-bbd7-4bf5-9a95-6621514c4e59","Type":"ContainerDied","Data":"cd0cb068ee1a819b213f7fbb65e33eafe493dcf9805c1683afb060aeb3879a8d"} Oct 03 15:37:39 crc kubenswrapper[4774]: I1003 15:37:39.726617 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tj6db" event={"ID":"76578014-bbd7-4bf5-9a95-6621514c4e59","Type":"ContainerDied","Data":"8d63d763c2d9e7cf207a0ff744d5e1f32f945f46835a180e651d1af9b90dbe9c"} Oct 03 15:37:39 crc kubenswrapper[4774]: I1003 15:37:39.726665 4774 scope.go:117] "RemoveContainer" containerID="cd0cb068ee1a819b213f7fbb65e33eafe493dcf9805c1683afb060aeb3879a8d" Oct 03 15:37:39 crc kubenswrapper[4774]: I1003 15:37:39.776219 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tj6db"] Oct 03 15:37:39 crc kubenswrapper[4774]: I1003 15:37:39.779136 4774 scope.go:117] "RemoveContainer" containerID="be68ae402465815194cd9247db73e3f24484ce3db0872a86537dd8b0f0b05e9f" Oct 03 15:37:39 crc kubenswrapper[4774]: I1003 15:37:39.792198 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tj6db"] Oct 03 15:37:39 crc kubenswrapper[4774]: I1003 15:37:39.809572 4774 scope.go:117] "RemoveContainer" containerID="f725016dd750c0b4aad32145d1c69c7fe7f99650ed84e37ca67dfa0c4dc4fe44" Oct 03 15:37:39 crc kubenswrapper[4774]: I1003 15:37:39.888792 4774 scope.go:117] "RemoveContainer" containerID="cd0cb068ee1a819b213f7fbb65e33eafe493dcf9805c1683afb060aeb3879a8d" Oct 03 15:37:39 crc kubenswrapper[4774]: E1003 15:37:39.889274 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"cd0cb068ee1a819b213f7fbb65e33eafe493dcf9805c1683afb060aeb3879a8d\": container with ID starting with cd0cb068ee1a819b213f7fbb65e33eafe493dcf9805c1683afb060aeb3879a8d not found: ID does not exist" containerID="cd0cb068ee1a819b213f7fbb65e33eafe493dcf9805c1683afb060aeb3879a8d" Oct 03 15:37:39 crc kubenswrapper[4774]: I1003 15:37:39.889327 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd0cb068ee1a819b213f7fbb65e33eafe493dcf9805c1683afb060aeb3879a8d"} err="failed to get container status \"cd0cb068ee1a819b213f7fbb65e33eafe493dcf9805c1683afb060aeb3879a8d\": rpc error: code = NotFound desc = could not find container \"cd0cb068ee1a819b213f7fbb65e33eafe493dcf9805c1683afb060aeb3879a8d\": container with ID starting with cd0cb068ee1a819b213f7fbb65e33eafe493dcf9805c1683afb060aeb3879a8d not found: ID does not exist" Oct 03 15:37:39 crc kubenswrapper[4774]: I1003 15:37:39.889361 4774 scope.go:117] "RemoveContainer" containerID="be68ae402465815194cd9247db73e3f24484ce3db0872a86537dd8b0f0b05e9f" Oct 03 15:37:39 crc kubenswrapper[4774]: E1003 15:37:39.890454 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be68ae402465815194cd9247db73e3f24484ce3db0872a86537dd8b0f0b05e9f\": container with ID starting with be68ae402465815194cd9247db73e3f24484ce3db0872a86537dd8b0f0b05e9f not found: ID does not exist" containerID="be68ae402465815194cd9247db73e3f24484ce3db0872a86537dd8b0f0b05e9f" Oct 03 15:37:39 crc kubenswrapper[4774]: I1003 15:37:39.890508 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be68ae402465815194cd9247db73e3f24484ce3db0872a86537dd8b0f0b05e9f"} err="failed to get container status \"be68ae402465815194cd9247db73e3f24484ce3db0872a86537dd8b0f0b05e9f\": rpc error: code = NotFound desc = could not find container \"be68ae402465815194cd9247db73e3f24484ce3db0872a86537dd8b0f0b05e9f\": container with ID 
starting with be68ae402465815194cd9247db73e3f24484ce3db0872a86537dd8b0f0b05e9f not found: ID does not exist" Oct 03 15:37:39 crc kubenswrapper[4774]: I1003 15:37:39.890554 4774 scope.go:117] "RemoveContainer" containerID="f725016dd750c0b4aad32145d1c69c7fe7f99650ed84e37ca67dfa0c4dc4fe44" Oct 03 15:37:39 crc kubenswrapper[4774]: E1003 15:37:39.891082 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f725016dd750c0b4aad32145d1c69c7fe7f99650ed84e37ca67dfa0c4dc4fe44\": container with ID starting with f725016dd750c0b4aad32145d1c69c7fe7f99650ed84e37ca67dfa0c4dc4fe44 not found: ID does not exist" containerID="f725016dd750c0b4aad32145d1c69c7fe7f99650ed84e37ca67dfa0c4dc4fe44" Oct 03 15:37:39 crc kubenswrapper[4774]: I1003 15:37:39.891126 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f725016dd750c0b4aad32145d1c69c7fe7f99650ed84e37ca67dfa0c4dc4fe44"} err="failed to get container status \"f725016dd750c0b4aad32145d1c69c7fe7f99650ed84e37ca67dfa0c4dc4fe44\": rpc error: code = NotFound desc = could not find container \"f725016dd750c0b4aad32145d1c69c7fe7f99650ed84e37ca67dfa0c4dc4fe44\": container with ID starting with f725016dd750c0b4aad32145d1c69c7fe7f99650ed84e37ca67dfa0c4dc4fe44 not found: ID does not exist" Oct 03 15:37:41 crc kubenswrapper[4774]: I1003 15:37:41.313242 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76578014-bbd7-4bf5-9a95-6621514c4e59" path="/var/lib/kubelet/pods/76578014-bbd7-4bf5-9a95-6621514c4e59/volumes" Oct 03 15:37:47 crc kubenswrapper[4774]: I1003 15:37:47.301207 4774 scope.go:117] "RemoveContainer" containerID="cfea26b55e1d9b962f87ac922eb7772356d541492b505a24f7ff4d97f8d38ef7" Oct 03 15:37:47 crc kubenswrapper[4774]: E1003 15:37:47.302289 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:38:01 crc kubenswrapper[4774]: I1003 15:38:01.299434 4774 scope.go:117] "RemoveContainer" containerID="cfea26b55e1d9b962f87ac922eb7772356d541492b505a24f7ff4d97f8d38ef7" Oct 03 15:38:01 crc kubenswrapper[4774]: E1003 15:38:01.300725 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:38:14 crc kubenswrapper[4774]: I1003 15:38:14.299857 4774 scope.go:117] "RemoveContainer" containerID="cfea26b55e1d9b962f87ac922eb7772356d541492b505a24f7ff4d97f8d38ef7" Oct 03 15:38:14 crc kubenswrapper[4774]: E1003 15:38:14.300649 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:38:28 crc kubenswrapper[4774]: I1003 15:38:28.300123 4774 scope.go:117] "RemoveContainer" containerID="cfea26b55e1d9b962f87ac922eb7772356d541492b505a24f7ff4d97f8d38ef7" Oct 03 15:38:29 crc kubenswrapper[4774]: I1003 15:38:29.281762 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" 
event={"ID":"ca37ac4b-f421-4198-a179-12901d36f0f5","Type":"ContainerStarted","Data":"f2ac2522d841de5c1ea1ba89a06aceeed8a354385c6834a910301458a95a43d3"} Oct 03 15:39:31 crc kubenswrapper[4774]: I1003 15:39:31.378647 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-45xr9"] Oct 03 15:39:31 crc kubenswrapper[4774]: E1003 15:39:31.379633 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76578014-bbd7-4bf5-9a95-6621514c4e59" containerName="registry-server" Oct 03 15:39:31 crc kubenswrapper[4774]: I1003 15:39:31.379652 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="76578014-bbd7-4bf5-9a95-6621514c4e59" containerName="registry-server" Oct 03 15:39:31 crc kubenswrapper[4774]: E1003 15:39:31.379688 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76578014-bbd7-4bf5-9a95-6621514c4e59" containerName="extract-content" Oct 03 15:39:31 crc kubenswrapper[4774]: I1003 15:39:31.379696 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="76578014-bbd7-4bf5-9a95-6621514c4e59" containerName="extract-content" Oct 03 15:39:31 crc kubenswrapper[4774]: E1003 15:39:31.379721 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76578014-bbd7-4bf5-9a95-6621514c4e59" containerName="extract-utilities" Oct 03 15:39:31 crc kubenswrapper[4774]: I1003 15:39:31.379730 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="76578014-bbd7-4bf5-9a95-6621514c4e59" containerName="extract-utilities" Oct 03 15:39:31 crc kubenswrapper[4774]: I1003 15:39:31.379951 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="76578014-bbd7-4bf5-9a95-6621514c4e59" containerName="registry-server" Oct 03 15:39:31 crc kubenswrapper[4774]: I1003 15:39:31.381737 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-45xr9" Oct 03 15:39:31 crc kubenswrapper[4774]: I1003 15:39:31.415826 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-45xr9"] Oct 03 15:39:31 crc kubenswrapper[4774]: I1003 15:39:31.476097 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgrwl\" (UniqueName: \"kubernetes.io/projected/77726354-17d3-41d4-b1c8-5768a06024fb-kube-api-access-hgrwl\") pod \"community-operators-45xr9\" (UID: \"77726354-17d3-41d4-b1c8-5768a06024fb\") " pod="openshift-marketplace/community-operators-45xr9" Oct 03 15:39:31 crc kubenswrapper[4774]: I1003 15:39:31.476360 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77726354-17d3-41d4-b1c8-5768a06024fb-catalog-content\") pod \"community-operators-45xr9\" (UID: \"77726354-17d3-41d4-b1c8-5768a06024fb\") " pod="openshift-marketplace/community-operators-45xr9" Oct 03 15:39:31 crc kubenswrapper[4774]: I1003 15:39:31.476414 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77726354-17d3-41d4-b1c8-5768a06024fb-utilities\") pod \"community-operators-45xr9\" (UID: \"77726354-17d3-41d4-b1c8-5768a06024fb\") " pod="openshift-marketplace/community-operators-45xr9" Oct 03 15:39:31 crc kubenswrapper[4774]: I1003 15:39:31.577688 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77726354-17d3-41d4-b1c8-5768a06024fb-catalog-content\") pod \"community-operators-45xr9\" (UID: \"77726354-17d3-41d4-b1c8-5768a06024fb\") " pod="openshift-marketplace/community-operators-45xr9" Oct 03 15:39:31 crc kubenswrapper[4774]: I1003 15:39:31.577757 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77726354-17d3-41d4-b1c8-5768a06024fb-utilities\") pod \"community-operators-45xr9\" (UID: \"77726354-17d3-41d4-b1c8-5768a06024fb\") " pod="openshift-marketplace/community-operators-45xr9" Oct 03 15:39:31 crc kubenswrapper[4774]: I1003 15:39:31.577790 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgrwl\" (UniqueName: \"kubernetes.io/projected/77726354-17d3-41d4-b1c8-5768a06024fb-kube-api-access-hgrwl\") pod \"community-operators-45xr9\" (UID: \"77726354-17d3-41d4-b1c8-5768a06024fb\") " pod="openshift-marketplace/community-operators-45xr9" Oct 03 15:39:31 crc kubenswrapper[4774]: I1003 15:39:31.578225 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77726354-17d3-41d4-b1c8-5768a06024fb-catalog-content\") pod \"community-operators-45xr9\" (UID: \"77726354-17d3-41d4-b1c8-5768a06024fb\") " pod="openshift-marketplace/community-operators-45xr9" Oct 03 15:39:31 crc kubenswrapper[4774]: I1003 15:39:31.578242 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77726354-17d3-41d4-b1c8-5768a06024fb-utilities\") pod \"community-operators-45xr9\" (UID: \"77726354-17d3-41d4-b1c8-5768a06024fb\") " pod="openshift-marketplace/community-operators-45xr9" Oct 03 15:39:31 crc kubenswrapper[4774]: I1003 15:39:31.601072 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgrwl\" (UniqueName: \"kubernetes.io/projected/77726354-17d3-41d4-b1c8-5768a06024fb-kube-api-access-hgrwl\") pod \"community-operators-45xr9\" (UID: \"77726354-17d3-41d4-b1c8-5768a06024fb\") " pod="openshift-marketplace/community-operators-45xr9" Oct 03 15:39:31 crc kubenswrapper[4774]: I1003 15:39:31.716861 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-45xr9" Oct 03 15:39:32 crc kubenswrapper[4774]: I1003 15:39:32.231771 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-45xr9"] Oct 03 15:39:33 crc kubenswrapper[4774]: I1003 15:39:33.004037 4774 generic.go:334] "Generic (PLEG): container finished" podID="77726354-17d3-41d4-b1c8-5768a06024fb" containerID="3ef4ae44c4105c7b941e6dd32f63acdadde5a16ada83fff02835d1d255305bd8" exitCode=0 Oct 03 15:39:33 crc kubenswrapper[4774]: I1003 15:39:33.004151 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45xr9" event={"ID":"77726354-17d3-41d4-b1c8-5768a06024fb","Type":"ContainerDied","Data":"3ef4ae44c4105c7b941e6dd32f63acdadde5a16ada83fff02835d1d255305bd8"} Oct 03 15:39:33 crc kubenswrapper[4774]: I1003 15:39:33.004534 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45xr9" event={"ID":"77726354-17d3-41d4-b1c8-5768a06024fb","Type":"ContainerStarted","Data":"f5f8ee6f841d9aee50a542cc7b098c3559a984d359440f5a0ab4dfbefcfb05bf"} Oct 03 15:39:33 crc kubenswrapper[4774]: I1003 15:39:33.943880 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dkzzp"] Oct 03 15:39:33 crc kubenswrapper[4774]: I1003 15:39:33.946395 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dkzzp" Oct 03 15:39:33 crc kubenswrapper[4774]: I1003 15:39:33.956642 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dkzzp"] Oct 03 15:39:34 crc kubenswrapper[4774]: I1003 15:39:34.016234 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45xr9" event={"ID":"77726354-17d3-41d4-b1c8-5768a06024fb","Type":"ContainerStarted","Data":"b0f606ff86b0858d9a5b8cd8b00697b9dae1177db5504ca8dc3aaa07b831cfd2"} Oct 03 15:39:34 crc kubenswrapper[4774]: I1003 15:39:34.131170 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqkc2\" (UniqueName: \"kubernetes.io/projected/23ff1d5e-ccb1-4705-9d02-a2ba1359c04b-kube-api-access-sqkc2\") pod \"certified-operators-dkzzp\" (UID: \"23ff1d5e-ccb1-4705-9d02-a2ba1359c04b\") " pod="openshift-marketplace/certified-operators-dkzzp" Oct 03 15:39:34 crc kubenswrapper[4774]: I1003 15:39:34.131221 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23ff1d5e-ccb1-4705-9d02-a2ba1359c04b-catalog-content\") pod \"certified-operators-dkzzp\" (UID: \"23ff1d5e-ccb1-4705-9d02-a2ba1359c04b\") " pod="openshift-marketplace/certified-operators-dkzzp" Oct 03 15:39:34 crc kubenswrapper[4774]: I1003 15:39:34.131497 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23ff1d5e-ccb1-4705-9d02-a2ba1359c04b-utilities\") pod \"certified-operators-dkzzp\" (UID: \"23ff1d5e-ccb1-4705-9d02-a2ba1359c04b\") " pod="openshift-marketplace/certified-operators-dkzzp" Oct 03 15:39:34 crc kubenswrapper[4774]: I1003 15:39:34.232935 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/23ff1d5e-ccb1-4705-9d02-a2ba1359c04b-utilities\") pod \"certified-operators-dkzzp\" (UID: \"23ff1d5e-ccb1-4705-9d02-a2ba1359c04b\") " pod="openshift-marketplace/certified-operators-dkzzp" Oct 03 15:39:34 crc kubenswrapper[4774]: I1003 15:39:34.232982 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqkc2\" (UniqueName: \"kubernetes.io/projected/23ff1d5e-ccb1-4705-9d02-a2ba1359c04b-kube-api-access-sqkc2\") pod \"certified-operators-dkzzp\" (UID: \"23ff1d5e-ccb1-4705-9d02-a2ba1359c04b\") " pod="openshift-marketplace/certified-operators-dkzzp" Oct 03 15:39:34 crc kubenswrapper[4774]: I1003 15:39:34.233009 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23ff1d5e-ccb1-4705-9d02-a2ba1359c04b-catalog-content\") pod \"certified-operators-dkzzp\" (UID: \"23ff1d5e-ccb1-4705-9d02-a2ba1359c04b\") " pod="openshift-marketplace/certified-operators-dkzzp" Oct 03 15:39:34 crc kubenswrapper[4774]: I1003 15:39:34.233462 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23ff1d5e-ccb1-4705-9d02-a2ba1359c04b-utilities\") pod \"certified-operators-dkzzp\" (UID: \"23ff1d5e-ccb1-4705-9d02-a2ba1359c04b\") " pod="openshift-marketplace/certified-operators-dkzzp" Oct 03 15:39:34 crc kubenswrapper[4774]: I1003 15:39:34.233481 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23ff1d5e-ccb1-4705-9d02-a2ba1359c04b-catalog-content\") pod \"certified-operators-dkzzp\" (UID: \"23ff1d5e-ccb1-4705-9d02-a2ba1359c04b\") " pod="openshift-marketplace/certified-operators-dkzzp" Oct 03 15:39:34 crc kubenswrapper[4774]: I1003 15:39:34.259207 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqkc2\" (UniqueName: 
\"kubernetes.io/projected/23ff1d5e-ccb1-4705-9d02-a2ba1359c04b-kube-api-access-sqkc2\") pod \"certified-operators-dkzzp\" (UID: \"23ff1d5e-ccb1-4705-9d02-a2ba1359c04b\") " pod="openshift-marketplace/certified-operators-dkzzp" Oct 03 15:39:34 crc kubenswrapper[4774]: I1003 15:39:34.263082 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dkzzp" Oct 03 15:39:34 crc kubenswrapper[4774]: W1003 15:39:34.757199 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23ff1d5e_ccb1_4705_9d02_a2ba1359c04b.slice/crio-4fbb813eb4d5b275cd36a3b2baeec216045a503428d213a84e86d37e3981a0e1 WatchSource:0}: Error finding container 4fbb813eb4d5b275cd36a3b2baeec216045a503428d213a84e86d37e3981a0e1: Status 404 returned error can't find the container with id 4fbb813eb4d5b275cd36a3b2baeec216045a503428d213a84e86d37e3981a0e1 Oct 03 15:39:34 crc kubenswrapper[4774]: I1003 15:39:34.758660 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dkzzp"] Oct 03 15:39:35 crc kubenswrapper[4774]: I1003 15:39:35.030633 4774 generic.go:334] "Generic (PLEG): container finished" podID="23ff1d5e-ccb1-4705-9d02-a2ba1359c04b" containerID="540cc7c0c6b62c3f58131a0ebd058ff27d3c08540ff40993f6a6b9534a1b957b" exitCode=0 Oct 03 15:39:35 crc kubenswrapper[4774]: I1003 15:39:35.030922 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dkzzp" event={"ID":"23ff1d5e-ccb1-4705-9d02-a2ba1359c04b","Type":"ContainerDied","Data":"540cc7c0c6b62c3f58131a0ebd058ff27d3c08540ff40993f6a6b9534a1b957b"} Oct 03 15:39:35 crc kubenswrapper[4774]: I1003 15:39:35.030951 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dkzzp" 
event={"ID":"23ff1d5e-ccb1-4705-9d02-a2ba1359c04b","Type":"ContainerStarted","Data":"4fbb813eb4d5b275cd36a3b2baeec216045a503428d213a84e86d37e3981a0e1"} Oct 03 15:39:35 crc kubenswrapper[4774]: I1003 15:39:35.042998 4774 generic.go:334] "Generic (PLEG): container finished" podID="77726354-17d3-41d4-b1c8-5768a06024fb" containerID="b0f606ff86b0858d9a5b8cd8b00697b9dae1177db5504ca8dc3aaa07b831cfd2" exitCode=0 Oct 03 15:39:35 crc kubenswrapper[4774]: I1003 15:39:35.043238 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45xr9" event={"ID":"77726354-17d3-41d4-b1c8-5768a06024fb","Type":"ContainerDied","Data":"b0f606ff86b0858d9a5b8cd8b00697b9dae1177db5504ca8dc3aaa07b831cfd2"} Oct 03 15:39:36 crc kubenswrapper[4774]: I1003 15:39:36.057285 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dkzzp" event={"ID":"23ff1d5e-ccb1-4705-9d02-a2ba1359c04b","Type":"ContainerStarted","Data":"f6ebd6f79a49c66106e3e6f7e46476d2fa62418026d5854c8323da1b4bc7a43d"} Oct 03 15:39:36 crc kubenswrapper[4774]: I1003 15:39:36.059636 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45xr9" event={"ID":"77726354-17d3-41d4-b1c8-5768a06024fb","Type":"ContainerStarted","Data":"dacd690c3443f7e030f8e203624e18a26f9bd1f940b6c776a1b9cbf59500b364"} Oct 03 15:39:36 crc kubenswrapper[4774]: I1003 15:39:36.096674 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-45xr9" podStartSLOduration=2.621921211 podStartE2EDuration="5.096651116s" podCreationTimestamp="2025-10-03 15:39:31 +0000 UTC" firstStartedPulling="2025-10-03 15:39:33.015689505 +0000 UTC m=+3395.604892967" lastFinishedPulling="2025-10-03 15:39:35.49041939 +0000 UTC m=+3398.079622872" observedRunningTime="2025-10-03 15:39:36.091947109 +0000 UTC m=+3398.681150591" watchObservedRunningTime="2025-10-03 15:39:36.096651116 +0000 UTC 
m=+3398.685854598" Oct 03 15:39:37 crc kubenswrapper[4774]: I1003 15:39:37.070542 4774 generic.go:334] "Generic (PLEG): container finished" podID="23ff1d5e-ccb1-4705-9d02-a2ba1359c04b" containerID="f6ebd6f79a49c66106e3e6f7e46476d2fa62418026d5854c8323da1b4bc7a43d" exitCode=0 Oct 03 15:39:37 crc kubenswrapper[4774]: I1003 15:39:37.070636 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dkzzp" event={"ID":"23ff1d5e-ccb1-4705-9d02-a2ba1359c04b","Type":"ContainerDied","Data":"f6ebd6f79a49c66106e3e6f7e46476d2fa62418026d5854c8323da1b4bc7a43d"} Oct 03 15:39:39 crc kubenswrapper[4774]: I1003 15:39:39.133418 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dkzzp" event={"ID":"23ff1d5e-ccb1-4705-9d02-a2ba1359c04b","Type":"ContainerStarted","Data":"905cba252ef494cc4175cfdab026997cd80d0bb3840a32d43f79e5c27a2d8145"} Oct 03 15:39:39 crc kubenswrapper[4774]: I1003 15:39:39.162048 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dkzzp" podStartSLOduration=2.851151018 podStartE2EDuration="6.16202218s" podCreationTimestamp="2025-10-03 15:39:33 +0000 UTC" firstStartedPulling="2025-10-03 15:39:35.032567207 +0000 UTC m=+3397.621770689" lastFinishedPulling="2025-10-03 15:39:38.343438359 +0000 UTC m=+3400.932641851" observedRunningTime="2025-10-03 15:39:39.15438787 +0000 UTC m=+3401.743591332" watchObservedRunningTime="2025-10-03 15:39:39.16202218 +0000 UTC m=+3401.751225642" Oct 03 15:39:41 crc kubenswrapper[4774]: I1003 15:39:41.717506 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-45xr9" Oct 03 15:39:41 crc kubenswrapper[4774]: I1003 15:39:41.717912 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-45xr9" Oct 03 15:39:41 crc kubenswrapper[4774]: I1003 15:39:41.786447 4774 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-45xr9" Oct 03 15:39:42 crc kubenswrapper[4774]: I1003 15:39:42.224994 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-45xr9" Oct 03 15:39:42 crc kubenswrapper[4774]: I1003 15:39:42.743457 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-45xr9"] Oct 03 15:39:44 crc kubenswrapper[4774]: I1003 15:39:44.183473 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-45xr9" podUID="77726354-17d3-41d4-b1c8-5768a06024fb" containerName="registry-server" containerID="cri-o://dacd690c3443f7e030f8e203624e18a26f9bd1f940b6c776a1b9cbf59500b364" gracePeriod=2 Oct 03 15:39:44 crc kubenswrapper[4774]: I1003 15:39:44.263574 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dkzzp" Oct 03 15:39:44 crc kubenswrapper[4774]: I1003 15:39:44.264615 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dkzzp" Oct 03 15:39:44 crc kubenswrapper[4774]: I1003 15:39:44.338267 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dkzzp" Oct 03 15:39:44 crc kubenswrapper[4774]: I1003 15:39:44.800791 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-45xr9" Oct 03 15:39:44 crc kubenswrapper[4774]: I1003 15:39:44.968000 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77726354-17d3-41d4-b1c8-5768a06024fb-utilities\") pod \"77726354-17d3-41d4-b1c8-5768a06024fb\" (UID: \"77726354-17d3-41d4-b1c8-5768a06024fb\") " Oct 03 15:39:44 crc kubenswrapper[4774]: I1003 15:39:44.968520 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgrwl\" (UniqueName: \"kubernetes.io/projected/77726354-17d3-41d4-b1c8-5768a06024fb-kube-api-access-hgrwl\") pod \"77726354-17d3-41d4-b1c8-5768a06024fb\" (UID: \"77726354-17d3-41d4-b1c8-5768a06024fb\") " Oct 03 15:39:44 crc kubenswrapper[4774]: I1003 15:39:44.968712 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77726354-17d3-41d4-b1c8-5768a06024fb-catalog-content\") pod \"77726354-17d3-41d4-b1c8-5768a06024fb\" (UID: \"77726354-17d3-41d4-b1c8-5768a06024fb\") " Oct 03 15:39:44 crc kubenswrapper[4774]: I1003 15:39:44.969469 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77726354-17d3-41d4-b1c8-5768a06024fb-utilities" (OuterVolumeSpecName: "utilities") pod "77726354-17d3-41d4-b1c8-5768a06024fb" (UID: "77726354-17d3-41d4-b1c8-5768a06024fb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:39:44 crc kubenswrapper[4774]: I1003 15:39:44.977489 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77726354-17d3-41d4-b1c8-5768a06024fb-kube-api-access-hgrwl" (OuterVolumeSpecName: "kube-api-access-hgrwl") pod "77726354-17d3-41d4-b1c8-5768a06024fb" (UID: "77726354-17d3-41d4-b1c8-5768a06024fb"). InnerVolumeSpecName "kube-api-access-hgrwl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:39:45 crc kubenswrapper[4774]: I1003 15:39:45.027499 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77726354-17d3-41d4-b1c8-5768a06024fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "77726354-17d3-41d4-b1c8-5768a06024fb" (UID: "77726354-17d3-41d4-b1c8-5768a06024fb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:39:45 crc kubenswrapper[4774]: I1003 15:39:45.071194 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgrwl\" (UniqueName: \"kubernetes.io/projected/77726354-17d3-41d4-b1c8-5768a06024fb-kube-api-access-hgrwl\") on node \"crc\" DevicePath \"\"" Oct 03 15:39:45 crc kubenswrapper[4774]: I1003 15:39:45.071236 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77726354-17d3-41d4-b1c8-5768a06024fb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 15:39:45 crc kubenswrapper[4774]: I1003 15:39:45.071257 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77726354-17d3-41d4-b1c8-5768a06024fb-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 15:39:45 crc kubenswrapper[4774]: I1003 15:39:45.196690 4774 generic.go:334] "Generic (PLEG): container finished" podID="77726354-17d3-41d4-b1c8-5768a06024fb" containerID="dacd690c3443f7e030f8e203624e18a26f9bd1f940b6c776a1b9cbf59500b364" exitCode=0 Oct 03 15:39:45 crc kubenswrapper[4774]: I1003 15:39:45.196763 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-45xr9" Oct 03 15:39:45 crc kubenswrapper[4774]: I1003 15:39:45.196814 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45xr9" event={"ID":"77726354-17d3-41d4-b1c8-5768a06024fb","Type":"ContainerDied","Data":"dacd690c3443f7e030f8e203624e18a26f9bd1f940b6c776a1b9cbf59500b364"} Oct 03 15:39:45 crc kubenswrapper[4774]: I1003 15:39:45.196844 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45xr9" event={"ID":"77726354-17d3-41d4-b1c8-5768a06024fb","Type":"ContainerDied","Data":"f5f8ee6f841d9aee50a542cc7b098c3559a984d359440f5a0ab4dfbefcfb05bf"} Oct 03 15:39:45 crc kubenswrapper[4774]: I1003 15:39:45.196860 4774 scope.go:117] "RemoveContainer" containerID="dacd690c3443f7e030f8e203624e18a26f9bd1f940b6c776a1b9cbf59500b364" Oct 03 15:39:45 crc kubenswrapper[4774]: I1003 15:39:45.221993 4774 scope.go:117] "RemoveContainer" containerID="b0f606ff86b0858d9a5b8cd8b00697b9dae1177db5504ca8dc3aaa07b831cfd2" Oct 03 15:39:45 crc kubenswrapper[4774]: I1003 15:39:45.249306 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-45xr9"] Oct 03 15:39:45 crc kubenswrapper[4774]: I1003 15:39:45.257211 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dkzzp" Oct 03 15:39:45 crc kubenswrapper[4774]: I1003 15:39:45.258303 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-45xr9"] Oct 03 15:39:45 crc kubenswrapper[4774]: I1003 15:39:45.273614 4774 scope.go:117] "RemoveContainer" containerID="3ef4ae44c4105c7b941e6dd32f63acdadde5a16ada83fff02835d1d255305bd8" Oct 03 15:39:45 crc kubenswrapper[4774]: I1003 15:39:45.304469 4774 scope.go:117] "RemoveContainer" containerID="dacd690c3443f7e030f8e203624e18a26f9bd1f940b6c776a1b9cbf59500b364" Oct 03 15:39:45 crc 
kubenswrapper[4774]: E1003 15:39:45.304841 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dacd690c3443f7e030f8e203624e18a26f9bd1f940b6c776a1b9cbf59500b364\": container with ID starting with dacd690c3443f7e030f8e203624e18a26f9bd1f940b6c776a1b9cbf59500b364 not found: ID does not exist" containerID="dacd690c3443f7e030f8e203624e18a26f9bd1f940b6c776a1b9cbf59500b364" Oct 03 15:39:45 crc kubenswrapper[4774]: I1003 15:39:45.304873 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dacd690c3443f7e030f8e203624e18a26f9bd1f940b6c776a1b9cbf59500b364"} err="failed to get container status \"dacd690c3443f7e030f8e203624e18a26f9bd1f940b6c776a1b9cbf59500b364\": rpc error: code = NotFound desc = could not find container \"dacd690c3443f7e030f8e203624e18a26f9bd1f940b6c776a1b9cbf59500b364\": container with ID starting with dacd690c3443f7e030f8e203624e18a26f9bd1f940b6c776a1b9cbf59500b364 not found: ID does not exist" Oct 03 15:39:45 crc kubenswrapper[4774]: I1003 15:39:45.304900 4774 scope.go:117] "RemoveContainer" containerID="b0f606ff86b0858d9a5b8cd8b00697b9dae1177db5504ca8dc3aaa07b831cfd2" Oct 03 15:39:45 crc kubenswrapper[4774]: E1003 15:39:45.305313 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0f606ff86b0858d9a5b8cd8b00697b9dae1177db5504ca8dc3aaa07b831cfd2\": container with ID starting with b0f606ff86b0858d9a5b8cd8b00697b9dae1177db5504ca8dc3aaa07b831cfd2 not found: ID does not exist" containerID="b0f606ff86b0858d9a5b8cd8b00697b9dae1177db5504ca8dc3aaa07b831cfd2" Oct 03 15:39:45 crc kubenswrapper[4774]: I1003 15:39:45.305335 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0f606ff86b0858d9a5b8cd8b00697b9dae1177db5504ca8dc3aaa07b831cfd2"} err="failed to get container status 
\"b0f606ff86b0858d9a5b8cd8b00697b9dae1177db5504ca8dc3aaa07b831cfd2\": rpc error: code = NotFound desc = could not find container \"b0f606ff86b0858d9a5b8cd8b00697b9dae1177db5504ca8dc3aaa07b831cfd2\": container with ID starting with b0f606ff86b0858d9a5b8cd8b00697b9dae1177db5504ca8dc3aaa07b831cfd2 not found: ID does not exist" Oct 03 15:39:45 crc kubenswrapper[4774]: I1003 15:39:45.305351 4774 scope.go:117] "RemoveContainer" containerID="3ef4ae44c4105c7b941e6dd32f63acdadde5a16ada83fff02835d1d255305bd8" Oct 03 15:39:45 crc kubenswrapper[4774]: E1003 15:39:45.305726 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ef4ae44c4105c7b941e6dd32f63acdadde5a16ada83fff02835d1d255305bd8\": container with ID starting with 3ef4ae44c4105c7b941e6dd32f63acdadde5a16ada83fff02835d1d255305bd8 not found: ID does not exist" containerID="3ef4ae44c4105c7b941e6dd32f63acdadde5a16ada83fff02835d1d255305bd8" Oct 03 15:39:45 crc kubenswrapper[4774]: I1003 15:39:45.305767 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ef4ae44c4105c7b941e6dd32f63acdadde5a16ada83fff02835d1d255305bd8"} err="failed to get container status \"3ef4ae44c4105c7b941e6dd32f63acdadde5a16ada83fff02835d1d255305bd8\": rpc error: code = NotFound desc = could not find container \"3ef4ae44c4105c7b941e6dd32f63acdadde5a16ada83fff02835d1d255305bd8\": container with ID starting with 3ef4ae44c4105c7b941e6dd32f63acdadde5a16ada83fff02835d1d255305bd8 not found: ID does not exist" Oct 03 15:39:45 crc kubenswrapper[4774]: I1003 15:39:45.311666 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77726354-17d3-41d4-b1c8-5768a06024fb" path="/var/lib/kubelet/pods/77726354-17d3-41d4-b1c8-5768a06024fb/volumes" Oct 03 15:39:47 crc kubenswrapper[4774]: I1003 15:39:47.537875 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dkzzp"] Oct 03 
15:39:48 crc kubenswrapper[4774]: I1003 15:39:48.230071 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dkzzp" podUID="23ff1d5e-ccb1-4705-9d02-a2ba1359c04b" containerName="registry-server" containerID="cri-o://905cba252ef494cc4175cfdab026997cd80d0bb3840a32d43f79e5c27a2d8145" gracePeriod=2 Oct 03 15:39:48 crc kubenswrapper[4774]: I1003 15:39:48.716233 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dkzzp" Oct 03 15:39:48 crc kubenswrapper[4774]: I1003 15:39:48.855144 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23ff1d5e-ccb1-4705-9d02-a2ba1359c04b-catalog-content\") pod \"23ff1d5e-ccb1-4705-9d02-a2ba1359c04b\" (UID: \"23ff1d5e-ccb1-4705-9d02-a2ba1359c04b\") " Oct 03 15:39:48 crc kubenswrapper[4774]: I1003 15:39:48.855200 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqkc2\" (UniqueName: \"kubernetes.io/projected/23ff1d5e-ccb1-4705-9d02-a2ba1359c04b-kube-api-access-sqkc2\") pod \"23ff1d5e-ccb1-4705-9d02-a2ba1359c04b\" (UID: \"23ff1d5e-ccb1-4705-9d02-a2ba1359c04b\") " Oct 03 15:39:48 crc kubenswrapper[4774]: I1003 15:39:48.855489 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23ff1d5e-ccb1-4705-9d02-a2ba1359c04b-utilities\") pod \"23ff1d5e-ccb1-4705-9d02-a2ba1359c04b\" (UID: \"23ff1d5e-ccb1-4705-9d02-a2ba1359c04b\") " Oct 03 15:39:48 crc kubenswrapper[4774]: I1003 15:39:48.856439 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23ff1d5e-ccb1-4705-9d02-a2ba1359c04b-utilities" (OuterVolumeSpecName: "utilities") pod "23ff1d5e-ccb1-4705-9d02-a2ba1359c04b" (UID: "23ff1d5e-ccb1-4705-9d02-a2ba1359c04b"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:39:48 crc kubenswrapper[4774]: I1003 15:39:48.861801 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23ff1d5e-ccb1-4705-9d02-a2ba1359c04b-kube-api-access-sqkc2" (OuterVolumeSpecName: "kube-api-access-sqkc2") pod "23ff1d5e-ccb1-4705-9d02-a2ba1359c04b" (UID: "23ff1d5e-ccb1-4705-9d02-a2ba1359c04b"). InnerVolumeSpecName "kube-api-access-sqkc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:39:48 crc kubenswrapper[4774]: I1003 15:39:48.905048 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23ff1d5e-ccb1-4705-9d02-a2ba1359c04b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "23ff1d5e-ccb1-4705-9d02-a2ba1359c04b" (UID: "23ff1d5e-ccb1-4705-9d02-a2ba1359c04b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:39:48 crc kubenswrapper[4774]: I1003 15:39:48.957874 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23ff1d5e-ccb1-4705-9d02-a2ba1359c04b-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 15:39:48 crc kubenswrapper[4774]: I1003 15:39:48.957907 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23ff1d5e-ccb1-4705-9d02-a2ba1359c04b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 15:39:48 crc kubenswrapper[4774]: I1003 15:39:48.957921 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqkc2\" (UniqueName: \"kubernetes.io/projected/23ff1d5e-ccb1-4705-9d02-a2ba1359c04b-kube-api-access-sqkc2\") on node \"crc\" DevicePath \"\"" Oct 03 15:39:49 crc kubenswrapper[4774]: I1003 15:39:49.246653 4774 generic.go:334] "Generic (PLEG): container finished" podID="23ff1d5e-ccb1-4705-9d02-a2ba1359c04b" 
containerID="905cba252ef494cc4175cfdab026997cd80d0bb3840a32d43f79e5c27a2d8145" exitCode=0 Oct 03 15:39:49 crc kubenswrapper[4774]: I1003 15:39:49.246745 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dkzzp" Oct 03 15:39:49 crc kubenswrapper[4774]: I1003 15:39:49.246735 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dkzzp" event={"ID":"23ff1d5e-ccb1-4705-9d02-a2ba1359c04b","Type":"ContainerDied","Data":"905cba252ef494cc4175cfdab026997cd80d0bb3840a32d43f79e5c27a2d8145"} Oct 03 15:39:49 crc kubenswrapper[4774]: I1003 15:39:49.246830 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dkzzp" event={"ID":"23ff1d5e-ccb1-4705-9d02-a2ba1359c04b","Type":"ContainerDied","Data":"4fbb813eb4d5b275cd36a3b2baeec216045a503428d213a84e86d37e3981a0e1"} Oct 03 15:39:49 crc kubenswrapper[4774]: I1003 15:39:49.246883 4774 scope.go:117] "RemoveContainer" containerID="905cba252ef494cc4175cfdab026997cd80d0bb3840a32d43f79e5c27a2d8145" Oct 03 15:39:49 crc kubenswrapper[4774]: I1003 15:39:49.279176 4774 scope.go:117] "RemoveContainer" containerID="f6ebd6f79a49c66106e3e6f7e46476d2fa62418026d5854c8323da1b4bc7a43d" Oct 03 15:39:49 crc kubenswrapper[4774]: I1003 15:39:49.315800 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dkzzp"] Oct 03 15:39:49 crc kubenswrapper[4774]: I1003 15:39:49.316808 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dkzzp"] Oct 03 15:39:49 crc kubenswrapper[4774]: I1003 15:39:49.322663 4774 scope.go:117] "RemoveContainer" containerID="540cc7c0c6b62c3f58131a0ebd058ff27d3c08540ff40993f6a6b9534a1b957b" Oct 03 15:39:49 crc kubenswrapper[4774]: I1003 15:39:49.355395 4774 scope.go:117] "RemoveContainer" containerID="905cba252ef494cc4175cfdab026997cd80d0bb3840a32d43f79e5c27a2d8145" Oct 03 
15:39:49 crc kubenswrapper[4774]: E1003 15:39:49.355917 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"905cba252ef494cc4175cfdab026997cd80d0bb3840a32d43f79e5c27a2d8145\": container with ID starting with 905cba252ef494cc4175cfdab026997cd80d0bb3840a32d43f79e5c27a2d8145 not found: ID does not exist" containerID="905cba252ef494cc4175cfdab026997cd80d0bb3840a32d43f79e5c27a2d8145" Oct 03 15:39:49 crc kubenswrapper[4774]: I1003 15:39:49.355970 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"905cba252ef494cc4175cfdab026997cd80d0bb3840a32d43f79e5c27a2d8145"} err="failed to get container status \"905cba252ef494cc4175cfdab026997cd80d0bb3840a32d43f79e5c27a2d8145\": rpc error: code = NotFound desc = could not find container \"905cba252ef494cc4175cfdab026997cd80d0bb3840a32d43f79e5c27a2d8145\": container with ID starting with 905cba252ef494cc4175cfdab026997cd80d0bb3840a32d43f79e5c27a2d8145 not found: ID does not exist" Oct 03 15:39:49 crc kubenswrapper[4774]: I1003 15:39:49.356002 4774 scope.go:117] "RemoveContainer" containerID="f6ebd6f79a49c66106e3e6f7e46476d2fa62418026d5854c8323da1b4bc7a43d" Oct 03 15:39:49 crc kubenswrapper[4774]: E1003 15:39:49.356435 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6ebd6f79a49c66106e3e6f7e46476d2fa62418026d5854c8323da1b4bc7a43d\": container with ID starting with f6ebd6f79a49c66106e3e6f7e46476d2fa62418026d5854c8323da1b4bc7a43d not found: ID does not exist" containerID="f6ebd6f79a49c66106e3e6f7e46476d2fa62418026d5854c8323da1b4bc7a43d" Oct 03 15:39:49 crc kubenswrapper[4774]: I1003 15:39:49.356521 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6ebd6f79a49c66106e3e6f7e46476d2fa62418026d5854c8323da1b4bc7a43d"} err="failed to get container status 
\"f6ebd6f79a49c66106e3e6f7e46476d2fa62418026d5854c8323da1b4bc7a43d\": rpc error: code = NotFound desc = could not find container \"f6ebd6f79a49c66106e3e6f7e46476d2fa62418026d5854c8323da1b4bc7a43d\": container with ID starting with f6ebd6f79a49c66106e3e6f7e46476d2fa62418026d5854c8323da1b4bc7a43d not found: ID does not exist" Oct 03 15:39:49 crc kubenswrapper[4774]: I1003 15:39:49.356558 4774 scope.go:117] "RemoveContainer" containerID="540cc7c0c6b62c3f58131a0ebd058ff27d3c08540ff40993f6a6b9534a1b957b" Oct 03 15:39:49 crc kubenswrapper[4774]: E1003 15:39:49.356896 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"540cc7c0c6b62c3f58131a0ebd058ff27d3c08540ff40993f6a6b9534a1b957b\": container with ID starting with 540cc7c0c6b62c3f58131a0ebd058ff27d3c08540ff40993f6a6b9534a1b957b not found: ID does not exist" containerID="540cc7c0c6b62c3f58131a0ebd058ff27d3c08540ff40993f6a6b9534a1b957b" Oct 03 15:39:49 crc kubenswrapper[4774]: I1003 15:39:49.356950 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"540cc7c0c6b62c3f58131a0ebd058ff27d3c08540ff40993f6a6b9534a1b957b"} err="failed to get container status \"540cc7c0c6b62c3f58131a0ebd058ff27d3c08540ff40993f6a6b9534a1b957b\": rpc error: code = NotFound desc = could not find container \"540cc7c0c6b62c3f58131a0ebd058ff27d3c08540ff40993f6a6b9534a1b957b\": container with ID starting with 540cc7c0c6b62c3f58131a0ebd058ff27d3c08540ff40993f6a6b9534a1b957b not found: ID does not exist" Oct 03 15:39:51 crc kubenswrapper[4774]: I1003 15:39:51.313314 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23ff1d5e-ccb1-4705-9d02-a2ba1359c04b" path="/var/lib/kubelet/pods/23ff1d5e-ccb1-4705-9d02-a2ba1359c04b/volumes" Oct 03 15:40:50 crc kubenswrapper[4774]: I1003 15:40:50.705161 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:40:50 crc kubenswrapper[4774]: I1003 15:40:50.705755 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:41:07 crc kubenswrapper[4774]: I1003 15:41:07.873762 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kwjwh"] Oct 03 15:41:07 crc kubenswrapper[4774]: E1003 15:41:07.874877 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23ff1d5e-ccb1-4705-9d02-a2ba1359c04b" containerName="extract-content" Oct 03 15:41:07 crc kubenswrapper[4774]: I1003 15:41:07.874892 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="23ff1d5e-ccb1-4705-9d02-a2ba1359c04b" containerName="extract-content" Oct 03 15:41:07 crc kubenswrapper[4774]: E1003 15:41:07.874905 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23ff1d5e-ccb1-4705-9d02-a2ba1359c04b" containerName="extract-utilities" Oct 03 15:41:07 crc kubenswrapper[4774]: I1003 15:41:07.874912 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="23ff1d5e-ccb1-4705-9d02-a2ba1359c04b" containerName="extract-utilities" Oct 03 15:41:07 crc kubenswrapper[4774]: E1003 15:41:07.874924 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77726354-17d3-41d4-b1c8-5768a06024fb" containerName="extract-utilities" Oct 03 15:41:07 crc kubenswrapper[4774]: I1003 15:41:07.874931 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="77726354-17d3-41d4-b1c8-5768a06024fb" containerName="extract-utilities" Oct 03 15:41:07 crc kubenswrapper[4774]: E1003 
15:41:07.874942 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23ff1d5e-ccb1-4705-9d02-a2ba1359c04b" containerName="registry-server" Oct 03 15:41:07 crc kubenswrapper[4774]: I1003 15:41:07.874947 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="23ff1d5e-ccb1-4705-9d02-a2ba1359c04b" containerName="registry-server" Oct 03 15:41:07 crc kubenswrapper[4774]: E1003 15:41:07.874964 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77726354-17d3-41d4-b1c8-5768a06024fb" containerName="registry-server" Oct 03 15:41:07 crc kubenswrapper[4774]: I1003 15:41:07.874970 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="77726354-17d3-41d4-b1c8-5768a06024fb" containerName="registry-server" Oct 03 15:41:07 crc kubenswrapper[4774]: E1003 15:41:07.874977 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77726354-17d3-41d4-b1c8-5768a06024fb" containerName="extract-content" Oct 03 15:41:07 crc kubenswrapper[4774]: I1003 15:41:07.874983 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="77726354-17d3-41d4-b1c8-5768a06024fb" containerName="extract-content" Oct 03 15:41:07 crc kubenswrapper[4774]: I1003 15:41:07.875161 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="77726354-17d3-41d4-b1c8-5768a06024fb" containerName="registry-server" Oct 03 15:41:07 crc kubenswrapper[4774]: I1003 15:41:07.875173 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="23ff1d5e-ccb1-4705-9d02-a2ba1359c04b" containerName="registry-server" Oct 03 15:41:07 crc kubenswrapper[4774]: I1003 15:41:07.878964 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kwjwh"
Oct 03 15:41:07 crc kubenswrapper[4774]: I1003 15:41:07.885908 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwjwh"]
Oct 03 15:41:08 crc kubenswrapper[4774]: I1003 15:41:08.065234 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba9992a1-be7e-403d-8b10-7ba5ad20c3b0-catalog-content\") pod \"redhat-marketplace-kwjwh\" (UID: \"ba9992a1-be7e-403d-8b10-7ba5ad20c3b0\") " pod="openshift-marketplace/redhat-marketplace-kwjwh"
Oct 03 15:41:08 crc kubenswrapper[4774]: I1003 15:41:08.065539 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba9992a1-be7e-403d-8b10-7ba5ad20c3b0-utilities\") pod \"redhat-marketplace-kwjwh\" (UID: \"ba9992a1-be7e-403d-8b10-7ba5ad20c3b0\") " pod="openshift-marketplace/redhat-marketplace-kwjwh"
Oct 03 15:41:08 crc kubenswrapper[4774]: I1003 15:41:08.065585 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lggcn\" (UniqueName: \"kubernetes.io/projected/ba9992a1-be7e-403d-8b10-7ba5ad20c3b0-kube-api-access-lggcn\") pod \"redhat-marketplace-kwjwh\" (UID: \"ba9992a1-be7e-403d-8b10-7ba5ad20c3b0\") " pod="openshift-marketplace/redhat-marketplace-kwjwh"
Oct 03 15:41:08 crc kubenswrapper[4774]: I1003 15:41:08.167401 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba9992a1-be7e-403d-8b10-7ba5ad20c3b0-catalog-content\") pod \"redhat-marketplace-kwjwh\" (UID: \"ba9992a1-be7e-403d-8b10-7ba5ad20c3b0\") " pod="openshift-marketplace/redhat-marketplace-kwjwh"
Oct 03 15:41:08 crc kubenswrapper[4774]: I1003 15:41:08.167454 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba9992a1-be7e-403d-8b10-7ba5ad20c3b0-utilities\") pod \"redhat-marketplace-kwjwh\" (UID: \"ba9992a1-be7e-403d-8b10-7ba5ad20c3b0\") " pod="openshift-marketplace/redhat-marketplace-kwjwh"
Oct 03 15:41:08 crc kubenswrapper[4774]: I1003 15:41:08.167503 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lggcn\" (UniqueName: \"kubernetes.io/projected/ba9992a1-be7e-403d-8b10-7ba5ad20c3b0-kube-api-access-lggcn\") pod \"redhat-marketplace-kwjwh\" (UID: \"ba9992a1-be7e-403d-8b10-7ba5ad20c3b0\") " pod="openshift-marketplace/redhat-marketplace-kwjwh"
Oct 03 15:41:08 crc kubenswrapper[4774]: I1003 15:41:08.168179 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba9992a1-be7e-403d-8b10-7ba5ad20c3b0-catalog-content\") pod \"redhat-marketplace-kwjwh\" (UID: \"ba9992a1-be7e-403d-8b10-7ba5ad20c3b0\") " pod="openshift-marketplace/redhat-marketplace-kwjwh"
Oct 03 15:41:08 crc kubenswrapper[4774]: I1003 15:41:08.168472 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba9992a1-be7e-403d-8b10-7ba5ad20c3b0-utilities\") pod \"redhat-marketplace-kwjwh\" (UID: \"ba9992a1-be7e-403d-8b10-7ba5ad20c3b0\") " pod="openshift-marketplace/redhat-marketplace-kwjwh"
Oct 03 15:41:08 crc kubenswrapper[4774]: I1003 15:41:08.186146 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lggcn\" (UniqueName: \"kubernetes.io/projected/ba9992a1-be7e-403d-8b10-7ba5ad20c3b0-kube-api-access-lggcn\") pod \"redhat-marketplace-kwjwh\" (UID: \"ba9992a1-be7e-403d-8b10-7ba5ad20c3b0\") " pod="openshift-marketplace/redhat-marketplace-kwjwh"
Oct 03 15:41:08 crc kubenswrapper[4774]: I1003 15:41:08.205259 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kwjwh"
Oct 03 15:41:08 crc kubenswrapper[4774]: I1003 15:41:08.673953 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwjwh"]
Oct 03 15:41:08 crc kubenswrapper[4774]: W1003 15:41:08.680889 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba9992a1_be7e_403d_8b10_7ba5ad20c3b0.slice/crio-76a8b4642db06465d395952d51f71a0ad6da5a6fb3560d3efe4a75a2fd5a0b61 WatchSource:0}: Error finding container 76a8b4642db06465d395952d51f71a0ad6da5a6fb3560d3efe4a75a2fd5a0b61: Status 404 returned error can't find the container with id 76a8b4642db06465d395952d51f71a0ad6da5a6fb3560d3efe4a75a2fd5a0b61
Oct 03 15:41:09 crc kubenswrapper[4774]: I1003 15:41:09.074878 4774 generic.go:334] "Generic (PLEG): container finished" podID="ba9992a1-be7e-403d-8b10-7ba5ad20c3b0" containerID="32fda2029b3962821a82180dab293384470a17b8cfaa79b1b76a7bc2be0be2dd" exitCode=0
Oct 03 15:41:09 crc kubenswrapper[4774]: I1003 15:41:09.074963 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwjwh" event={"ID":"ba9992a1-be7e-403d-8b10-7ba5ad20c3b0","Type":"ContainerDied","Data":"32fda2029b3962821a82180dab293384470a17b8cfaa79b1b76a7bc2be0be2dd"}
Oct 03 15:41:09 crc kubenswrapper[4774]: I1003 15:41:09.076152 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwjwh" event={"ID":"ba9992a1-be7e-403d-8b10-7ba5ad20c3b0","Type":"ContainerStarted","Data":"76a8b4642db06465d395952d51f71a0ad6da5a6fb3560d3efe4a75a2fd5a0b61"}
Oct 03 15:41:11 crc kubenswrapper[4774]: I1003 15:41:11.105699 4774 generic.go:334] "Generic (PLEG): container finished" podID="ba9992a1-be7e-403d-8b10-7ba5ad20c3b0" containerID="2620116a6ffc06fb09988092d91d125eecd8f7ebc3dc352b715765eb2e0f35e8" exitCode=0
Oct 03 15:41:11 crc kubenswrapper[4774]: I1003 15:41:11.105825 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwjwh" event={"ID":"ba9992a1-be7e-403d-8b10-7ba5ad20c3b0","Type":"ContainerDied","Data":"2620116a6ffc06fb09988092d91d125eecd8f7ebc3dc352b715765eb2e0f35e8"}
Oct 03 15:41:12 crc kubenswrapper[4774]: I1003 15:41:12.120754 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwjwh" event={"ID":"ba9992a1-be7e-403d-8b10-7ba5ad20c3b0","Type":"ContainerStarted","Data":"3550a68110e0352df41eb7f2a747d7edfa2449d4bceb0fd0e508378e70e848ab"}
Oct 03 15:41:12 crc kubenswrapper[4774]: I1003 15:41:12.154232 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kwjwh" podStartSLOduration=2.5493441629999998 podStartE2EDuration="5.154213928s" podCreationTimestamp="2025-10-03 15:41:07 +0000 UTC" firstStartedPulling="2025-10-03 15:41:09.077123002 +0000 UTC m=+3491.666326454" lastFinishedPulling="2025-10-03 15:41:11.681992727 +0000 UTC m=+3494.271196219" observedRunningTime="2025-10-03 15:41:12.144869335 +0000 UTC m=+3494.734072787" watchObservedRunningTime="2025-10-03 15:41:12.154213928 +0000 UTC m=+3494.743417380"
Oct 03 15:41:18 crc kubenswrapper[4774]: I1003 15:41:18.205952 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kwjwh"
Oct 03 15:41:18 crc kubenswrapper[4774]: I1003 15:41:18.206816 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kwjwh"
Oct 03 15:41:18 crc kubenswrapper[4774]: I1003 15:41:18.286787 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kwjwh"
Oct 03 15:41:19 crc kubenswrapper[4774]: I1003 15:41:19.257887 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kwjwh"
Oct 03 15:41:19 crc kubenswrapper[4774]: I1003 15:41:19.643203 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwjwh"]
Oct 03 15:41:20 crc kubenswrapper[4774]: I1003 15:41:20.654210 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 15:41:20 crc kubenswrapper[4774]: I1003 15:41:20.654964 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 15:41:21 crc kubenswrapper[4774]: I1003 15:41:21.221462 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kwjwh" podUID="ba9992a1-be7e-403d-8b10-7ba5ad20c3b0" containerName="registry-server" containerID="cri-o://3550a68110e0352df41eb7f2a747d7edfa2449d4bceb0fd0e508378e70e848ab" gracePeriod=2
Oct 03 15:41:21 crc kubenswrapper[4774]: I1003 15:41:21.745977 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kwjwh"
Oct 03 15:41:21 crc kubenswrapper[4774]: I1003 15:41:21.859716 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lggcn\" (UniqueName: \"kubernetes.io/projected/ba9992a1-be7e-403d-8b10-7ba5ad20c3b0-kube-api-access-lggcn\") pod \"ba9992a1-be7e-403d-8b10-7ba5ad20c3b0\" (UID: \"ba9992a1-be7e-403d-8b10-7ba5ad20c3b0\") "
Oct 03 15:41:21 crc kubenswrapper[4774]: I1003 15:41:21.859777 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba9992a1-be7e-403d-8b10-7ba5ad20c3b0-utilities\") pod \"ba9992a1-be7e-403d-8b10-7ba5ad20c3b0\" (UID: \"ba9992a1-be7e-403d-8b10-7ba5ad20c3b0\") "
Oct 03 15:41:21 crc kubenswrapper[4774]: I1003 15:41:21.859838 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba9992a1-be7e-403d-8b10-7ba5ad20c3b0-catalog-content\") pod \"ba9992a1-be7e-403d-8b10-7ba5ad20c3b0\" (UID: \"ba9992a1-be7e-403d-8b10-7ba5ad20c3b0\") "
Oct 03 15:41:21 crc kubenswrapper[4774]: I1003 15:41:21.861717 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba9992a1-be7e-403d-8b10-7ba5ad20c3b0-utilities" (OuterVolumeSpecName: "utilities") pod "ba9992a1-be7e-403d-8b10-7ba5ad20c3b0" (UID: "ba9992a1-be7e-403d-8b10-7ba5ad20c3b0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 15:41:21 crc kubenswrapper[4774]: I1003 15:41:21.867002 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba9992a1-be7e-403d-8b10-7ba5ad20c3b0-kube-api-access-lggcn" (OuterVolumeSpecName: "kube-api-access-lggcn") pod "ba9992a1-be7e-403d-8b10-7ba5ad20c3b0" (UID: "ba9992a1-be7e-403d-8b10-7ba5ad20c3b0"). InnerVolumeSpecName "kube-api-access-lggcn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 15:41:21 crc kubenswrapper[4774]: I1003 15:41:21.876870 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba9992a1-be7e-403d-8b10-7ba5ad20c3b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ba9992a1-be7e-403d-8b10-7ba5ad20c3b0" (UID: "ba9992a1-be7e-403d-8b10-7ba5ad20c3b0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 15:41:21 crc kubenswrapper[4774]: I1003 15:41:21.962311 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lggcn\" (UniqueName: \"kubernetes.io/projected/ba9992a1-be7e-403d-8b10-7ba5ad20c3b0-kube-api-access-lggcn\") on node \"crc\" DevicePath \"\""
Oct 03 15:41:21 crc kubenswrapper[4774]: I1003 15:41:21.962352 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba9992a1-be7e-403d-8b10-7ba5ad20c3b0-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 15:41:21 crc kubenswrapper[4774]: I1003 15:41:21.962367 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba9992a1-be7e-403d-8b10-7ba5ad20c3b0-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 15:41:22 crc kubenswrapper[4774]: I1003 15:41:22.241146 4774 generic.go:334] "Generic (PLEG): container finished" podID="ba9992a1-be7e-403d-8b10-7ba5ad20c3b0" containerID="3550a68110e0352df41eb7f2a747d7edfa2449d4bceb0fd0e508378e70e848ab" exitCode=0
Oct 03 15:41:22 crc kubenswrapper[4774]: I1003 15:41:22.241281 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kwjwh"
Oct 03 15:41:22 crc kubenswrapper[4774]: I1003 15:41:22.241271 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwjwh" event={"ID":"ba9992a1-be7e-403d-8b10-7ba5ad20c3b0","Type":"ContainerDied","Data":"3550a68110e0352df41eb7f2a747d7edfa2449d4bceb0fd0e508378e70e848ab"}
Oct 03 15:41:22 crc kubenswrapper[4774]: I1003 15:41:22.241679 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwjwh" event={"ID":"ba9992a1-be7e-403d-8b10-7ba5ad20c3b0","Type":"ContainerDied","Data":"76a8b4642db06465d395952d51f71a0ad6da5a6fb3560d3efe4a75a2fd5a0b61"}
Oct 03 15:41:22 crc kubenswrapper[4774]: I1003 15:41:22.241714 4774 scope.go:117] "RemoveContainer" containerID="3550a68110e0352df41eb7f2a747d7edfa2449d4bceb0fd0e508378e70e848ab"
Oct 03 15:41:22 crc kubenswrapper[4774]: I1003 15:41:22.288292 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwjwh"]
Oct 03 15:41:22 crc kubenswrapper[4774]: I1003 15:41:22.288457 4774 scope.go:117] "RemoveContainer" containerID="2620116a6ffc06fb09988092d91d125eecd8f7ebc3dc352b715765eb2e0f35e8"
Oct 03 15:41:22 crc kubenswrapper[4774]: I1003 15:41:22.300492 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwjwh"]
Oct 03 15:41:22 crc kubenswrapper[4774]: I1003 15:41:22.321904 4774 scope.go:117] "RemoveContainer" containerID="32fda2029b3962821a82180dab293384470a17b8cfaa79b1b76a7bc2be0be2dd"
Oct 03 15:41:22 crc kubenswrapper[4774]: I1003 15:41:22.368687 4774 scope.go:117] "RemoveContainer" containerID="3550a68110e0352df41eb7f2a747d7edfa2449d4bceb0fd0e508378e70e848ab"
Oct 03 15:41:22 crc kubenswrapper[4774]: E1003 15:41:22.369363 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3550a68110e0352df41eb7f2a747d7edfa2449d4bceb0fd0e508378e70e848ab\": container with ID starting with 3550a68110e0352df41eb7f2a747d7edfa2449d4bceb0fd0e508378e70e848ab not found: ID does not exist" containerID="3550a68110e0352df41eb7f2a747d7edfa2449d4bceb0fd0e508378e70e848ab"
Oct 03 15:41:22 crc kubenswrapper[4774]: I1003 15:41:22.369429 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3550a68110e0352df41eb7f2a747d7edfa2449d4bceb0fd0e508378e70e848ab"} err="failed to get container status \"3550a68110e0352df41eb7f2a747d7edfa2449d4bceb0fd0e508378e70e848ab\": rpc error: code = NotFound desc = could not find container \"3550a68110e0352df41eb7f2a747d7edfa2449d4bceb0fd0e508378e70e848ab\": container with ID starting with 3550a68110e0352df41eb7f2a747d7edfa2449d4bceb0fd0e508378e70e848ab not found: ID does not exist"
Oct 03 15:41:22 crc kubenswrapper[4774]: I1003 15:41:22.369464 4774 scope.go:117] "RemoveContainer" containerID="2620116a6ffc06fb09988092d91d125eecd8f7ebc3dc352b715765eb2e0f35e8"
Oct 03 15:41:22 crc kubenswrapper[4774]: E1003 15:41:22.369946 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2620116a6ffc06fb09988092d91d125eecd8f7ebc3dc352b715765eb2e0f35e8\": container with ID starting with 2620116a6ffc06fb09988092d91d125eecd8f7ebc3dc352b715765eb2e0f35e8 not found: ID does not exist" containerID="2620116a6ffc06fb09988092d91d125eecd8f7ebc3dc352b715765eb2e0f35e8"
Oct 03 15:41:22 crc kubenswrapper[4774]: I1003 15:41:22.369985 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2620116a6ffc06fb09988092d91d125eecd8f7ebc3dc352b715765eb2e0f35e8"} err="failed to get container status \"2620116a6ffc06fb09988092d91d125eecd8f7ebc3dc352b715765eb2e0f35e8\": rpc error: code = NotFound desc = could not find container \"2620116a6ffc06fb09988092d91d125eecd8f7ebc3dc352b715765eb2e0f35e8\": container with ID starting with 2620116a6ffc06fb09988092d91d125eecd8f7ebc3dc352b715765eb2e0f35e8 not found: ID does not exist"
Oct 03 15:41:22 crc kubenswrapper[4774]: I1003 15:41:22.370014 4774 scope.go:117] "RemoveContainer" containerID="32fda2029b3962821a82180dab293384470a17b8cfaa79b1b76a7bc2be0be2dd"
Oct 03 15:41:22 crc kubenswrapper[4774]: E1003 15:41:22.370411 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32fda2029b3962821a82180dab293384470a17b8cfaa79b1b76a7bc2be0be2dd\": container with ID starting with 32fda2029b3962821a82180dab293384470a17b8cfaa79b1b76a7bc2be0be2dd not found: ID does not exist" containerID="32fda2029b3962821a82180dab293384470a17b8cfaa79b1b76a7bc2be0be2dd"
Oct 03 15:41:22 crc kubenswrapper[4774]: I1003 15:41:22.370474 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32fda2029b3962821a82180dab293384470a17b8cfaa79b1b76a7bc2be0be2dd"} err="failed to get container status \"32fda2029b3962821a82180dab293384470a17b8cfaa79b1b76a7bc2be0be2dd\": rpc error: code = NotFound desc = could not find container \"32fda2029b3962821a82180dab293384470a17b8cfaa79b1b76a7bc2be0be2dd\": container with ID starting with 32fda2029b3962821a82180dab293384470a17b8cfaa79b1b76a7bc2be0be2dd not found: ID does not exist"
Oct 03 15:41:23 crc kubenswrapper[4774]: I1003 15:41:23.320574 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba9992a1-be7e-403d-8b10-7ba5ad20c3b0" path="/var/lib/kubelet/pods/ba9992a1-be7e-403d-8b10-7ba5ad20c3b0/volumes"
Oct 03 15:41:50 crc kubenswrapper[4774]: I1003 15:41:50.654506 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 15:41:50 crc kubenswrapper[4774]: I1003 15:41:50.655345 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 15:41:50 crc kubenswrapper[4774]: I1003 15:41:50.655472 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z"
Oct 03 15:41:50 crc kubenswrapper[4774]: I1003 15:41:50.656903 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f2ac2522d841de5c1ea1ba89a06aceeed8a354385c6834a910301458a95a43d3"} pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 03 15:41:50 crc kubenswrapper[4774]: I1003 15:41:50.657035 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" containerID="cri-o://f2ac2522d841de5c1ea1ba89a06aceeed8a354385c6834a910301458a95a43d3" gracePeriod=600
Oct 03 15:41:51 crc kubenswrapper[4774]: I1003 15:41:51.572526 4774 generic.go:334] "Generic (PLEG): container finished" podID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerID="f2ac2522d841de5c1ea1ba89a06aceeed8a354385c6834a910301458a95a43d3" exitCode=0
Oct 03 15:41:51 crc kubenswrapper[4774]: I1003 15:41:51.572606 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" event={"ID":"ca37ac4b-f421-4198-a179-12901d36f0f5","Type":"ContainerDied","Data":"f2ac2522d841de5c1ea1ba89a06aceeed8a354385c6834a910301458a95a43d3"}
Oct 03 15:41:51 crc kubenswrapper[4774]: I1003 15:41:51.573033 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" event={"ID":"ca37ac4b-f421-4198-a179-12901d36f0f5","Type":"ContainerStarted","Data":"6a9aac6dbc322b9090f9a59162378fc7732101d04a69222bfa4d228837707411"}
Oct 03 15:41:51 crc kubenswrapper[4774]: I1003 15:41:51.573058 4774 scope.go:117] "RemoveContainer" containerID="cfea26b55e1d9b962f87ac922eb7772356d541492b505a24f7ff4d97f8d38ef7"
Oct 03 15:42:07 crc kubenswrapper[4774]: I1003 15:42:07.746147 4774 generic.go:334] "Generic (PLEG): container finished" podID="7b63f9fa-2194-46e4-bfe0-d7efb33f10fb" containerID="723acbbb0d8164532a2cf396d95d9bc57eef75d84b4a4a86a082d9e9ca3ffb6b" exitCode=0
Oct 03 15:42:07 crc kubenswrapper[4774]: I1003 15:42:07.746249 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb","Type":"ContainerDied","Data":"723acbbb0d8164532a2cf396d95d9bc57eef75d84b4a4a86a082d9e9ca3ffb6b"}
Oct 03 15:42:09 crc kubenswrapper[4774]: I1003 15:42:09.107158 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Oct 03 15:42:09 crc kubenswrapper[4774]: I1003 15:42:09.168557 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") "
Oct 03 15:42:09 crc kubenswrapper[4774]: I1003 15:42:09.168621 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-openstack-config\") pod \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") "
Oct 03 15:42:09 crc kubenswrapper[4774]: I1003 15:42:09.168683 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqw5h\" (UniqueName: \"kubernetes.io/projected/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-kube-api-access-bqw5h\") pod \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") "
Oct 03 15:42:09 crc kubenswrapper[4774]: I1003 15:42:09.168704 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-openstack-config-secret\") pod \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") "
Oct 03 15:42:09 crc kubenswrapper[4774]: I1003 15:42:09.168723 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-ssh-key\") pod \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") "
Oct 03 15:42:09 crc kubenswrapper[4774]: I1003 15:42:09.168747 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-test-operator-ephemeral-temporary\") pod \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") "
Oct 03 15:42:09 crc kubenswrapper[4774]: I1003 15:42:09.168768 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-ca-certs\") pod \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") "
Oct 03 15:42:09 crc kubenswrapper[4774]: I1003 15:42:09.168806 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-test-operator-ephemeral-workdir\") pod \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") "
Oct 03 15:42:09 crc kubenswrapper[4774]: I1003 15:42:09.168869 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-config-data\") pod \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\" (UID: \"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb\") "
Oct 03 15:42:09 crc kubenswrapper[4774]: I1003 15:42:09.170080 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "7b63f9fa-2194-46e4-bfe0-d7efb33f10fb" (UID: "7b63f9fa-2194-46e4-bfe0-d7efb33f10fb"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 15:42:09 crc kubenswrapper[4774]: I1003 15:42:09.170227 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-config-data" (OuterVolumeSpecName: "config-data") pod "7b63f9fa-2194-46e4-bfe0-d7efb33f10fb" (UID: "7b63f9fa-2194-46e4-bfe0-d7efb33f10fb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 15:42:09 crc kubenswrapper[4774]: I1003 15:42:09.174350 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-kube-api-access-bqw5h" (OuterVolumeSpecName: "kube-api-access-bqw5h") pod "7b63f9fa-2194-46e4-bfe0-d7efb33f10fb" (UID: "7b63f9fa-2194-46e4-bfe0-d7efb33f10fb"). InnerVolumeSpecName "kube-api-access-bqw5h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 15:42:09 crc kubenswrapper[4774]: I1003 15:42:09.174723 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "7b63f9fa-2194-46e4-bfe0-d7efb33f10fb" (UID: "7b63f9fa-2194-46e4-bfe0-d7efb33f10fb"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 15:42:09 crc kubenswrapper[4774]: I1003 15:42:09.174970 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "test-operator-logs") pod "7b63f9fa-2194-46e4-bfe0-d7efb33f10fb" (UID: "7b63f9fa-2194-46e4-bfe0-d7efb33f10fb"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 03 15:42:09 crc kubenswrapper[4774]: I1003 15:42:09.195848 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "7b63f9fa-2194-46e4-bfe0-d7efb33f10fb" (UID: "7b63f9fa-2194-46e4-bfe0-d7efb33f10fb"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 15:42:09 crc kubenswrapper[4774]: I1003 15:42:09.203551 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7b63f9fa-2194-46e4-bfe0-d7efb33f10fb" (UID: "7b63f9fa-2194-46e4-bfe0-d7efb33f10fb"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 15:42:09 crc kubenswrapper[4774]: I1003 15:42:09.214579 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "7b63f9fa-2194-46e4-bfe0-d7efb33f10fb" (UID: "7b63f9fa-2194-46e4-bfe0-d7efb33f10fb"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 15:42:09 crc kubenswrapper[4774]: I1003 15:42:09.231992 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "7b63f9fa-2194-46e4-bfe0-d7efb33f10fb" (UID: "7b63f9fa-2194-46e4-bfe0-d7efb33f10fb"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 15:42:09 crc kubenswrapper[4774]: I1003 15:42:09.270797 4774 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-ca-certs\") on node \"crc\" DevicePath \"\""
Oct 03 15:42:09 crc kubenswrapper[4774]: I1003 15:42:09.271651 4774 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\""
Oct 03 15:42:09 crc kubenswrapper[4774]: I1003 15:42:09.271818 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 15:42:09 crc kubenswrapper[4774]: I1003 15:42:09.271993 4774 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" "
Oct 03 15:42:09 crc kubenswrapper[4774]: I1003 15:42:09.272152 4774 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-openstack-config\") on node \"crc\" DevicePath \"\""
Oct 03 15:42:09 crc kubenswrapper[4774]: I1003 15:42:09.272300 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqw5h\" (UniqueName: \"kubernetes.io/projected/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-kube-api-access-bqw5h\") on node \"crc\" DevicePath \"\""
Oct 03 15:42:09 crc kubenswrapper[4774]: I1003 15:42:09.272471 4774 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Oct 03 15:42:09 crc kubenswrapper[4774]: I1003 15:42:09.272617 4774 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 03 15:42:09 crc kubenswrapper[4774]: I1003 15:42:09.272754 4774 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/7b63f9fa-2194-46e4-bfe0-d7efb33f10fb-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\""
Oct 03 15:42:09 crc kubenswrapper[4774]: I1003 15:42:09.302934 4774 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc"
Oct 03 15:42:09 crc kubenswrapper[4774]: I1003 15:42:09.375665 4774 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\""
Oct 03 15:42:09 crc kubenswrapper[4774]: I1003 15:42:09.769783 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"7b63f9fa-2194-46e4-bfe0-d7efb33f10fb","Type":"ContainerDied","Data":"9ef358004b1c1acc8e28ad9a0e97a6fe55caaf6a859f8edeb9cc25f78073ded5"}
Oct 03 15:42:09 crc kubenswrapper[4774]: I1003 15:42:09.770082 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ef358004b1c1acc8e28ad9a0e97a6fe55caaf6a859f8edeb9cc25f78073ded5"
Oct 03 15:42:09 crc kubenswrapper[4774]: I1003 15:42:09.769889 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Oct 03 15:42:14 crc kubenswrapper[4774]: I1003 15:42:14.014517 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Oct 03 15:42:14 crc kubenswrapper[4774]: E1003 15:42:14.015599 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba9992a1-be7e-403d-8b10-7ba5ad20c3b0" containerName="registry-server"
Oct 03 15:42:14 crc kubenswrapper[4774]: I1003 15:42:14.015613 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba9992a1-be7e-403d-8b10-7ba5ad20c3b0" containerName="registry-server"
Oct 03 15:42:14 crc kubenswrapper[4774]: E1003 15:42:14.015634 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba9992a1-be7e-403d-8b10-7ba5ad20c3b0" containerName="extract-utilities"
Oct 03 15:42:14 crc kubenswrapper[4774]: I1003 15:42:14.015643 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba9992a1-be7e-403d-8b10-7ba5ad20c3b0" containerName="extract-utilities"
Oct 03 15:42:14 crc kubenswrapper[4774]: E1003 15:42:14.015670 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b63f9fa-2194-46e4-bfe0-d7efb33f10fb" containerName="tempest-tests-tempest-tests-runner"
Oct 03 15:42:14 crc kubenswrapper[4774]: I1003 15:42:14.015677 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b63f9fa-2194-46e4-bfe0-d7efb33f10fb" containerName="tempest-tests-tempest-tests-runner"
Oct 03 15:42:14 crc kubenswrapper[4774]: E1003 15:42:14.015691 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba9992a1-be7e-403d-8b10-7ba5ad20c3b0" containerName="extract-content"
Oct 03 15:42:14 crc kubenswrapper[4774]: I1003 15:42:14.015700 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba9992a1-be7e-403d-8b10-7ba5ad20c3b0" containerName="extract-content"
Oct 03 15:42:14 crc kubenswrapper[4774]: I1003 15:42:14.015873 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b63f9fa-2194-46e4-bfe0-d7efb33f10fb" containerName="tempest-tests-tempest-tests-runner"
Oct 03 15:42:14 crc kubenswrapper[4774]: I1003 15:42:14.015894 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba9992a1-be7e-403d-8b10-7ba5ad20c3b0" containerName="registry-server"
Oct 03 15:42:14 crc kubenswrapper[4774]: I1003 15:42:14.016634 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Oct 03 15:42:14 crc kubenswrapper[4774]: I1003 15:42:14.019297 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-pk52n"
Oct 03 15:42:14 crc kubenswrapper[4774]: I1003 15:42:14.022733 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Oct 03 15:42:14 crc kubenswrapper[4774]: I1003 15:42:14.162953 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8lqh\" (UniqueName: \"kubernetes.io/projected/90552d32-4d94-4fc4-b843-60a78206b347-kube-api-access-h8lqh\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"90552d32-4d94-4fc4-b843-60a78206b347\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Oct 03 15:42:14 crc kubenswrapper[4774]: I1003 15:42:14.163141 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"90552d32-4d94-4fc4-b843-60a78206b347\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Oct 03 15:42:14 crc kubenswrapper[4774]: I1003 15:42:14.264993 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"90552d32-4d94-4fc4-b843-60a78206b347\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Oct 03 15:42:14 crc kubenswrapper[4774]: I1003 15:42:14.265126 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8lqh\" (UniqueName: \"kubernetes.io/projected/90552d32-4d94-4fc4-b843-60a78206b347-kube-api-access-h8lqh\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"90552d32-4d94-4fc4-b843-60a78206b347\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Oct 03 15:42:14 crc kubenswrapper[4774]: I1003 15:42:14.265827 4774 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"90552d32-4d94-4fc4-b843-60a78206b347\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Oct 03 15:42:14 crc kubenswrapper[4774]: I1003 15:42:14.287405 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8lqh\" (UniqueName: \"kubernetes.io/projected/90552d32-4d94-4fc4-b843-60a78206b347-kube-api-access-h8lqh\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"90552d32-4d94-4fc4-b843-60a78206b347\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Oct 03 15:42:14 crc kubenswrapper[4774]: I1003 15:42:14.310126 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"90552d32-4d94-4fc4-b843-60a78206b347\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Oct 03 15:42:14
crc kubenswrapper[4774]: I1003 15:42:14.340613 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 03 15:42:14 crc kubenswrapper[4774]: I1003 15:42:14.794797 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 03 15:42:14 crc kubenswrapper[4774]: I1003 15:42:14.815986 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"90552d32-4d94-4fc4-b843-60a78206b347","Type":"ContainerStarted","Data":"f55c1095b614b2f8ee12d65ed4c1708f04d3121427dbed52a1cbaf734e3fc289"} Oct 03 15:42:16 crc kubenswrapper[4774]: I1003 15:42:16.839173 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"90552d32-4d94-4fc4-b843-60a78206b347","Type":"ContainerStarted","Data":"90ef25245f5eee2e90270529e278a013f703052e485b3f27601680479b1a3939"} Oct 03 15:42:16 crc kubenswrapper[4774]: I1003 15:42:16.874479 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.476352921 podStartE2EDuration="3.874452194s" podCreationTimestamp="2025-10-03 15:42:13 +0000 UTC" firstStartedPulling="2025-10-03 15:42:14.79877555 +0000 UTC m=+3557.387979012" lastFinishedPulling="2025-10-03 15:42:16.196874833 +0000 UTC m=+3558.786078285" observedRunningTime="2025-10-03 15:42:16.859567813 +0000 UTC m=+3559.448771335" watchObservedRunningTime="2025-10-03 15:42:16.874452194 +0000 UTC m=+3559.463655686" Oct 03 15:42:32 crc kubenswrapper[4774]: I1003 15:42:32.746357 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r575n/must-gather-7pk24"] Oct 03 15:42:32 crc kubenswrapper[4774]: I1003 15:42:32.748865 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r575n/must-gather-7pk24" Oct 03 15:42:32 crc kubenswrapper[4774]: I1003 15:42:32.752995 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-r575n"/"kube-root-ca.crt" Oct 03 15:42:32 crc kubenswrapper[4774]: I1003 15:42:32.753073 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-r575n"/"default-dockercfg-v7kkx" Oct 03 15:42:32 crc kubenswrapper[4774]: I1003 15:42:32.755138 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-r575n"/"openshift-service-ca.crt" Oct 03 15:42:32 crc kubenswrapper[4774]: I1003 15:42:32.761324 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r575n/must-gather-7pk24"] Oct 03 15:42:32 crc kubenswrapper[4774]: I1003 15:42:32.832650 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndgkc\" (UniqueName: \"kubernetes.io/projected/9153c48d-9951-41c0-b790-8f44aa0c8e77-kube-api-access-ndgkc\") pod \"must-gather-7pk24\" (UID: \"9153c48d-9951-41c0-b790-8f44aa0c8e77\") " pod="openshift-must-gather-r575n/must-gather-7pk24" Oct 03 15:42:32 crc kubenswrapper[4774]: I1003 15:42:32.832696 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9153c48d-9951-41c0-b790-8f44aa0c8e77-must-gather-output\") pod \"must-gather-7pk24\" (UID: \"9153c48d-9951-41c0-b790-8f44aa0c8e77\") " pod="openshift-must-gather-r575n/must-gather-7pk24" Oct 03 15:42:32 crc kubenswrapper[4774]: I1003 15:42:32.934339 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndgkc\" (UniqueName: \"kubernetes.io/projected/9153c48d-9951-41c0-b790-8f44aa0c8e77-kube-api-access-ndgkc\") pod \"must-gather-7pk24\" (UID: \"9153c48d-9951-41c0-b790-8f44aa0c8e77\") " 
pod="openshift-must-gather-r575n/must-gather-7pk24" Oct 03 15:42:32 crc kubenswrapper[4774]: I1003 15:42:32.934407 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9153c48d-9951-41c0-b790-8f44aa0c8e77-must-gather-output\") pod \"must-gather-7pk24\" (UID: \"9153c48d-9951-41c0-b790-8f44aa0c8e77\") " pod="openshift-must-gather-r575n/must-gather-7pk24" Oct 03 15:42:32 crc kubenswrapper[4774]: I1003 15:42:32.934962 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9153c48d-9951-41c0-b790-8f44aa0c8e77-must-gather-output\") pod \"must-gather-7pk24\" (UID: \"9153c48d-9951-41c0-b790-8f44aa0c8e77\") " pod="openshift-must-gather-r575n/must-gather-7pk24" Oct 03 15:42:32 crc kubenswrapper[4774]: I1003 15:42:32.953422 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndgkc\" (UniqueName: \"kubernetes.io/projected/9153c48d-9951-41c0-b790-8f44aa0c8e77-kube-api-access-ndgkc\") pod \"must-gather-7pk24\" (UID: \"9153c48d-9951-41c0-b790-8f44aa0c8e77\") " pod="openshift-must-gather-r575n/must-gather-7pk24" Oct 03 15:42:33 crc kubenswrapper[4774]: I1003 15:42:33.070774 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r575n/must-gather-7pk24" Oct 03 15:42:33 crc kubenswrapper[4774]: I1003 15:42:33.540013 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r575n/must-gather-7pk24"] Oct 03 15:42:33 crc kubenswrapper[4774]: I1003 15:42:33.548669 4774 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 15:42:34 crc kubenswrapper[4774]: I1003 15:42:34.020743 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r575n/must-gather-7pk24" event={"ID":"9153c48d-9951-41c0-b790-8f44aa0c8e77","Type":"ContainerStarted","Data":"6c604f86327051b084c95ce4dc89070ca7eb112c741eaada154b7a3b0a94c74f"} Oct 03 15:42:38 crc kubenswrapper[4774]: I1003 15:42:38.061440 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r575n/must-gather-7pk24" event={"ID":"9153c48d-9951-41c0-b790-8f44aa0c8e77","Type":"ContainerStarted","Data":"10eee216ac078c5911001b6a5c610126818015502066578145b170be85aab961"} Oct 03 15:42:38 crc kubenswrapper[4774]: I1003 15:42:38.061937 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r575n/must-gather-7pk24" event={"ID":"9153c48d-9951-41c0-b790-8f44aa0c8e77","Type":"ContainerStarted","Data":"464edeeac6284cf36415fec88fb62e4acdc0daa5bbc59f7677d45d09146e3778"} Oct 03 15:42:38 crc kubenswrapper[4774]: I1003 15:42:38.074501 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-r575n/must-gather-7pk24" podStartSLOduration=2.208600227 podStartE2EDuration="6.074480513s" podCreationTimestamp="2025-10-03 15:42:32 +0000 UTC" firstStartedPulling="2025-10-03 15:42:33.54844924 +0000 UTC m=+3576.137652692" lastFinishedPulling="2025-10-03 15:42:37.414329516 +0000 UTC m=+3580.003532978" observedRunningTime="2025-10-03 15:42:38.07275944 +0000 UTC m=+3580.661962902" watchObservedRunningTime="2025-10-03 15:42:38.074480513 +0000 UTC 
m=+3580.663683975" Oct 03 15:42:41 crc kubenswrapper[4774]: I1003 15:42:41.454118 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r575n/crc-debug-mpbnh"] Oct 03 15:42:41 crc kubenswrapper[4774]: I1003 15:42:41.457078 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r575n/crc-debug-mpbnh" Oct 03 15:42:41 crc kubenswrapper[4774]: I1003 15:42:41.614009 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82c95baa-90fd-473a-9e2f-03ed4c4356e8-host\") pod \"crc-debug-mpbnh\" (UID: \"82c95baa-90fd-473a-9e2f-03ed4c4356e8\") " pod="openshift-must-gather-r575n/crc-debug-mpbnh" Oct 03 15:42:41 crc kubenswrapper[4774]: I1003 15:42:41.614479 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpx6l\" (UniqueName: \"kubernetes.io/projected/82c95baa-90fd-473a-9e2f-03ed4c4356e8-kube-api-access-gpx6l\") pod \"crc-debug-mpbnh\" (UID: \"82c95baa-90fd-473a-9e2f-03ed4c4356e8\") " pod="openshift-must-gather-r575n/crc-debug-mpbnh" Oct 03 15:42:41 crc kubenswrapper[4774]: I1003 15:42:41.716001 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82c95baa-90fd-473a-9e2f-03ed4c4356e8-host\") pod \"crc-debug-mpbnh\" (UID: \"82c95baa-90fd-473a-9e2f-03ed4c4356e8\") " pod="openshift-must-gather-r575n/crc-debug-mpbnh" Oct 03 15:42:41 crc kubenswrapper[4774]: I1003 15:42:41.716094 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpx6l\" (UniqueName: \"kubernetes.io/projected/82c95baa-90fd-473a-9e2f-03ed4c4356e8-kube-api-access-gpx6l\") pod \"crc-debug-mpbnh\" (UID: \"82c95baa-90fd-473a-9e2f-03ed4c4356e8\") " pod="openshift-must-gather-r575n/crc-debug-mpbnh" Oct 03 15:42:41 crc kubenswrapper[4774]: I1003 15:42:41.716131 4774 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82c95baa-90fd-473a-9e2f-03ed4c4356e8-host\") pod \"crc-debug-mpbnh\" (UID: \"82c95baa-90fd-473a-9e2f-03ed4c4356e8\") " pod="openshift-must-gather-r575n/crc-debug-mpbnh" Oct 03 15:42:41 crc kubenswrapper[4774]: I1003 15:42:41.772238 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpx6l\" (UniqueName: \"kubernetes.io/projected/82c95baa-90fd-473a-9e2f-03ed4c4356e8-kube-api-access-gpx6l\") pod \"crc-debug-mpbnh\" (UID: \"82c95baa-90fd-473a-9e2f-03ed4c4356e8\") " pod="openshift-must-gather-r575n/crc-debug-mpbnh" Oct 03 15:42:41 crc kubenswrapper[4774]: I1003 15:42:41.788156 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r575n/crc-debug-mpbnh" Oct 03 15:42:42 crc kubenswrapper[4774]: I1003 15:42:42.106965 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r575n/crc-debug-mpbnh" event={"ID":"82c95baa-90fd-473a-9e2f-03ed4c4356e8","Type":"ContainerStarted","Data":"05ff923091ba0d5ead6276d7d447c2fa68199b6ed838d8849af93c8686ad4cf1"} Oct 03 15:42:52 crc kubenswrapper[4774]: I1003 15:42:52.196004 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r575n/crc-debug-mpbnh" event={"ID":"82c95baa-90fd-473a-9e2f-03ed4c4356e8","Type":"ContainerStarted","Data":"3a42f66e98606607a8760dca189f553ab933febe9eec76a246d64b849d67b91b"} Oct 03 15:43:42 crc kubenswrapper[4774]: I1003 15:43:42.915166 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5587b8897d-cknc7_79e2926f-2fec-48fb-95d2-c3afcfec7c4c/barbican-api/0.log" Oct 03 15:43:43 crc kubenswrapper[4774]: I1003 15:43:43.020648 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5587b8897d-cknc7_79e2926f-2fec-48fb-95d2-c3afcfec7c4c/barbican-api-log/0.log" Oct 03 15:43:43 crc kubenswrapper[4774]: I1003 
15:43:43.145538 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5b679cb88b-5lrzw_3ceaeb2a-3322-44b9-88ae-5c473721a68f/barbican-keystone-listener/0.log" Oct 03 15:43:43 crc kubenswrapper[4774]: I1003 15:43:43.273040 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5b679cb88b-5lrzw_3ceaeb2a-3322-44b9-88ae-5c473721a68f/barbican-keystone-listener-log/0.log" Oct 03 15:43:43 crc kubenswrapper[4774]: I1003 15:43:43.378648 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-74d9f6d95f-l289b_04d07033-b1e8-426d-828b-e78cb0f44294/barbican-worker/0.log" Oct 03 15:43:43 crc kubenswrapper[4774]: I1003 15:43:43.524547 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-74d9f6d95f-l289b_04d07033-b1e8-426d-828b-e78cb0f44294/barbican-worker-log/0.log" Oct 03 15:43:43 crc kubenswrapper[4774]: I1003 15:43:43.657255 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb_d70dce54-aa22-4af1-a341-4ff90ba78722/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:43:43 crc kubenswrapper[4774]: I1003 15:43:43.822800 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_cebee680-e845-4735-95f6-e97d844399a3/ceilometer-central-agent/0.log" Oct 03 15:43:43 crc kubenswrapper[4774]: I1003 15:43:43.876136 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_cebee680-e845-4735-95f6-e97d844399a3/ceilometer-notification-agent/0.log" Oct 03 15:43:44 crc kubenswrapper[4774]: I1003 15:43:44.011810 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_cebee680-e845-4735-95f6-e97d844399a3/sg-core/0.log" Oct 03 15:43:44 crc kubenswrapper[4774]: I1003 15:43:44.014865 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_cebee680-e845-4735-95f6-e97d844399a3/proxy-httpd/0.log" Oct 03 15:43:44 crc kubenswrapper[4774]: I1003 15:43:44.206081 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c2605586-9dec-4e4f-a61d-7a93535cbaa2/cinder-api-log/0.log" Oct 03 15:43:44 crc kubenswrapper[4774]: I1003 15:43:44.267649 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c2605586-9dec-4e4f-a61d-7a93535cbaa2/cinder-api/0.log" Oct 03 15:43:44 crc kubenswrapper[4774]: I1003 15:43:44.427231 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a370be32-2d52-48b7-b529-53e1d92a89a9/cinder-scheduler/0.log" Oct 03 15:43:44 crc kubenswrapper[4774]: I1003 15:43:44.475740 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a370be32-2d52-48b7-b529-53e1d92a89a9/probe/0.log" Oct 03 15:43:44 crc kubenswrapper[4774]: I1003 15:43:44.615852 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-pnvh8_9d27605e-3b35-4000-a3d5-88cecbf24b5a/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:43:44 crc kubenswrapper[4774]: I1003 15:43:44.877432 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-6shkp_9522d412-aaac-4917-86a0-2d9c40830b8d/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:43:44 crc kubenswrapper[4774]: I1003 15:43:44.991578 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-97dwl_365904b3-7404-4fe1-a9bf-c2711b345c08/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:43:45 crc kubenswrapper[4774]: I1003 15:43:45.090944 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-vrzcl_6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4/init/0.log" Oct 03 15:43:45 crc kubenswrapper[4774]: I1003 15:43:45.351938 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-vrzcl_6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4/dnsmasq-dns/0.log" Oct 03 15:43:45 crc kubenswrapper[4774]: I1003 15:43:45.364539 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-vrzcl_6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4/init/0.log" Oct 03 15:43:45 crc kubenswrapper[4774]: I1003 15:43:45.588242 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-bh568_58fbe1ae-46c5-4bb6-99ec-61ca05d737b1/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:43:45 crc kubenswrapper[4774]: I1003 15:43:45.695592 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_576a053d-3110-4bf1-a079-512e6bc51cbe/glance-httpd/0.log" Oct 03 15:43:45 crc kubenswrapper[4774]: I1003 15:43:45.801015 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_576a053d-3110-4bf1-a079-512e6bc51cbe/glance-log/0.log" Oct 03 15:43:45 crc kubenswrapper[4774]: I1003 15:43:45.961759 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_24f9335f-6756-4590-9e0a-6bc7bd1f4b3e/glance-httpd/0.log" Oct 03 15:43:46 crc kubenswrapper[4774]: I1003 15:43:46.011108 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_24f9335f-6756-4590-9e0a-6bc7bd1f4b3e/glance-log/0.log" Oct 03 15:43:46 crc kubenswrapper[4774]: I1003 15:43:46.278494 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-f865bb968-k9r7v_c0b4826d-75e2-4023-8d53-3ddd0da5bc2e/horizon/0.log" Oct 03 15:43:46 crc kubenswrapper[4774]: I1003 15:43:46.365140 
4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-x58cf_2ba66309-584b-4165-86a5-ca30af49d159/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:43:46 crc kubenswrapper[4774]: I1003 15:43:46.483079 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-f865bb968-k9r7v_c0b4826d-75e2-4023-8d53-3ddd0da5bc2e/horizon-log/0.log" Oct 03 15:43:46 crc kubenswrapper[4774]: I1003 15:43:46.572942 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-tzhjs_c8595599-9969-4c2e-bc3a-f2ff038d8c11/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:43:46 crc kubenswrapper[4774]: I1003 15:43:46.870822 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_a584407c-d7f9-436b-a293-fe97f4ed3c78/kube-state-metrics/0.log" Oct 03 15:43:47 crc kubenswrapper[4774]: I1003 15:43:47.055577 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7444dd849d-z82k5_9be0f44b-e4c6-475d-976b-d0b30b456b9c/keystone-api/0.log" Oct 03 15:43:47 crc kubenswrapper[4774]: I1003 15:43:47.113544 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2_41e52a3d-812e-4067-a30e-e9f4ad329411/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:43:47 crc kubenswrapper[4774]: I1003 15:43:47.438167 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-987467b4f-dts4l_c5ff7df6-e04c-431b-bdbb-2579172a7706/neutron-api/0.log" Oct 03 15:43:47 crc kubenswrapper[4774]: I1003 15:43:47.485700 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-987467b4f-dts4l_c5ff7df6-e04c-431b-bdbb-2579172a7706/neutron-httpd/0.log" Oct 03 15:43:47 crc kubenswrapper[4774]: I1003 15:43:47.681486 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6_62582218-c190-4edd-8539-5ca8e8d348e3/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:43:48 crc kubenswrapper[4774]: I1003 15:43:48.326718 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_96557def-81a7-44a3-86d4-72e10daa7d68/nova-api-log/0.log" Oct 03 15:43:48 crc kubenswrapper[4774]: I1003 15:43:48.430096 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_747916db-cbde-4597-be0a-1e2034b1afca/nova-cell0-conductor-conductor/0.log" Oct 03 15:43:48 crc kubenswrapper[4774]: I1003 15:43:48.601149 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_96557def-81a7-44a3-86d4-72e10daa7d68/nova-api-api/0.log" Oct 03 15:43:48 crc kubenswrapper[4774]: I1003 15:43:48.760858 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_18303598-0afd-48d0-a93a-a523807d8e37/nova-cell1-conductor-conductor/0.log" Oct 03 15:43:48 crc kubenswrapper[4774]: I1003 15:43:48.960387 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_fec122f7-2237-49c0-b5aa-4e251827b058/nova-cell1-novncproxy-novncproxy/0.log" Oct 03 15:43:49 crc kubenswrapper[4774]: I1003 15:43:49.144357 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-8vrlt_a3033c9b-77df-46fe-b9f6-34fedecfbdc8/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:43:49 crc kubenswrapper[4774]: I1003 15:43:49.460480 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_3208c93b-4d66-477f-8255-677d70a111a1/nova-metadata-log/0.log" Oct 03 15:43:49 crc kubenswrapper[4774]: I1003 15:43:49.844075 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-scheduler-0_e38e5521-609a-4612-ad67-512c8a477e77/nova-scheduler-scheduler/0.log" Oct 03 15:43:50 crc kubenswrapper[4774]: I1003 15:43:50.057800 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_27254696-8788-47fe-a9a4-208fd295e427/mysql-bootstrap/0.log" Oct 03 15:43:50 crc kubenswrapper[4774]: I1003 15:43:50.430096 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_27254696-8788-47fe-a9a4-208fd295e427/mysql-bootstrap/0.log" Oct 03 15:43:50 crc kubenswrapper[4774]: I1003 15:43:50.470230 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_27254696-8788-47fe-a9a4-208fd295e427/galera/0.log" Oct 03 15:43:50 crc kubenswrapper[4774]: I1003 15:43:50.601660 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_3208c93b-4d66-477f-8255-677d70a111a1/nova-metadata-metadata/0.log" Oct 03 15:43:50 crc kubenswrapper[4774]: I1003 15:43:50.736092 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a5455d9b-4489-4041-b44d-990124dd84e4/mysql-bootstrap/0.log" Oct 03 15:43:50 crc kubenswrapper[4774]: I1003 15:43:50.967356 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a5455d9b-4489-4041-b44d-990124dd84e4/mysql-bootstrap/0.log" Oct 03 15:43:50 crc kubenswrapper[4774]: I1003 15:43:50.997343 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a5455d9b-4489-4041-b44d-990124dd84e4/galera/0.log" Oct 03 15:43:51 crc kubenswrapper[4774]: I1003 15:43:51.188779 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_f975364c-ff2b-49bb-9e5e-c0fea0d15daa/openstackclient/0.log" Oct 03 15:43:51 crc kubenswrapper[4774]: I1003 15:43:51.207877 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-4wgb7_b9111154-59d2-4b07-b8c3-db1870883cde/ovn-controller/0.log" Oct 03 15:43:51 crc kubenswrapper[4774]: I1003 15:43:51.432364 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-kh252_b62dc8be-5127-4f62-bdf9-f1db2425c2c1/openstack-network-exporter/0.log" Oct 03 15:43:51 crc kubenswrapper[4774]: I1003 15:43:51.762676 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bhmcl_03ee58bb-cd78-4fdb-986f-a9b60f9998e8/ovsdb-server-init/0.log" Oct 03 15:43:51 crc kubenswrapper[4774]: I1003 15:43:51.922185 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bhmcl_03ee58bb-cd78-4fdb-986f-a9b60f9998e8/ovsdb-server-init/0.log" Oct 03 15:43:52 crc kubenswrapper[4774]: I1003 15:43:52.039883 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bhmcl_03ee58bb-cd78-4fdb-986f-a9b60f9998e8/ovs-vswitchd/0.log" Oct 03 15:43:52 crc kubenswrapper[4774]: I1003 15:43:52.052897 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bhmcl_03ee58bb-cd78-4fdb-986f-a9b60f9998e8/ovsdb-server/0.log" Oct 03 15:43:52 crc kubenswrapper[4774]: I1003 15:43:52.328178 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-45z47_579db85b-3b4c-45b4-8bae-1b5a02c80e15/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:43:52 crc kubenswrapper[4774]: I1003 15:43:52.399418 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_148decd8-3262-44d1-858e-523458f7c1ee/openstack-network-exporter/0.log" Oct 03 15:43:52 crc kubenswrapper[4774]: I1003 15:43:52.575187 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_148decd8-3262-44d1-858e-523458f7c1ee/ovn-northd/0.log" Oct 03 15:43:52 crc kubenswrapper[4774]: I1003 15:43:52.616983 4774 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66/openstack-network-exporter/0.log" Oct 03 15:43:52 crc kubenswrapper[4774]: I1003 15:43:52.830210 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66/ovsdbserver-nb/0.log" Oct 03 15:43:52 crc kubenswrapper[4774]: I1003 15:43:52.859867 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c880318a-5ff5-46f8-aca9-134c52ed3ad1/openstack-network-exporter/0.log" Oct 03 15:43:53 crc kubenswrapper[4774]: I1003 15:43:53.063125 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c880318a-5ff5-46f8-aca9-134c52ed3ad1/ovsdbserver-sb/0.log" Oct 03 15:43:53 crc kubenswrapper[4774]: I1003 15:43:53.129560 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-bc86db558-frxdt_e952bb63-3c66-43a3-a8ef-34e636f1b400/placement-api/0.log" Oct 03 15:43:53 crc kubenswrapper[4774]: I1003 15:43:53.385015 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_417bcf92-1c5e-4977-a197-62b603b795a2/setup-container/0.log" Oct 03 15:43:53 crc kubenswrapper[4774]: I1003 15:43:53.403658 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-bc86db558-frxdt_e952bb63-3c66-43a3-a8ef-34e636f1b400/placement-log/0.log" Oct 03 15:43:53 crc kubenswrapper[4774]: I1003 15:43:53.530130 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_417bcf92-1c5e-4977-a197-62b603b795a2/setup-container/0.log" Oct 03 15:43:53 crc kubenswrapper[4774]: I1003 15:43:53.611474 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_417bcf92-1c5e-4977-a197-62b603b795a2/rabbitmq/0.log" Oct 03 15:43:53 crc kubenswrapper[4774]: I1003 15:43:53.777037 4774 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6d6cfa86-4356-4d79-9edd-977355592186/setup-container/0.log" Oct 03 15:43:53 crc kubenswrapper[4774]: I1003 15:43:53.998693 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6d6cfa86-4356-4d79-9edd-977355592186/setup-container/0.log" Oct 03 15:43:54 crc kubenswrapper[4774]: I1003 15:43:54.027787 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6d6cfa86-4356-4d79-9edd-977355592186/rabbitmq/0.log" Oct 03 15:43:54 crc kubenswrapper[4774]: I1003 15:43:54.289473 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-vchpx_4d2ae95f-a86b-4b58-a529-7b5d426bff79/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:43:54 crc kubenswrapper[4774]: I1003 15:43:54.295719 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-dshx4_776f45f5-5644-428e-a25f-9e3b36960fd9/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:43:54 crc kubenswrapper[4774]: I1003 15:43:54.497984 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh_136d617b-f485-4841-b6e2-350b591cd22e/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:43:54 crc kubenswrapper[4774]: I1003 15:43:54.674364 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-mrdws_81364ab7-a73d-4fef-b065-62983751634b/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:43:54 crc kubenswrapper[4774]: I1003 15:43:54.773133 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-rnpx9_04fa8c76-a20b-4ae9-86f0-fe4801763d0e/ssh-known-hosts-edpm-deployment/0.log" Oct 03 15:43:55 crc kubenswrapper[4774]: I1003 15:43:55.009199 4774 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_swift-proxy-5c49d658df-5r5zg_4f82147b-63cd-44bc-8950-bf87fa407688/proxy-server/0.log" Oct 03 15:43:55 crc kubenswrapper[4774]: I1003 15:43:55.106394 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5c49d658df-5r5zg_4f82147b-63cd-44bc-8950-bf87fa407688/proxy-httpd/0.log" Oct 03 15:43:55 crc kubenswrapper[4774]: I1003 15:43:55.212448 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-xxj5b_5244bd24-f205-4576-a7cd-6da859f28e21/swift-ring-rebalance/0.log" Oct 03 15:43:55 crc kubenswrapper[4774]: I1003 15:43:55.413943 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc0b2c39-8c1e-4401-97f6-a4306b435436/account-auditor/0.log" Oct 03 15:43:55 crc kubenswrapper[4774]: I1003 15:43:55.435174 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc0b2c39-8c1e-4401-97f6-a4306b435436/account-reaper/0.log" Oct 03 15:43:55 crc kubenswrapper[4774]: I1003 15:43:55.624859 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc0b2c39-8c1e-4401-97f6-a4306b435436/account-replicator/0.log" Oct 03 15:43:55 crc kubenswrapper[4774]: I1003 15:43:55.677869 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc0b2c39-8c1e-4401-97f6-a4306b435436/account-server/0.log" Oct 03 15:43:55 crc kubenswrapper[4774]: I1003 15:43:55.681908 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc0b2c39-8c1e-4401-97f6-a4306b435436/container-auditor/0.log" Oct 03 15:43:55 crc kubenswrapper[4774]: I1003 15:43:55.840437 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc0b2c39-8c1e-4401-97f6-a4306b435436/container-replicator/0.log" Oct 03 15:43:55 crc kubenswrapper[4774]: I1003 15:43:55.879536 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_bc0b2c39-8c1e-4401-97f6-a4306b435436/container-server/0.log" Oct 03 15:43:55 crc kubenswrapper[4774]: I1003 15:43:55.934713 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc0b2c39-8c1e-4401-97f6-a4306b435436/container-updater/0.log" Oct 03 15:43:56 crc kubenswrapper[4774]: I1003 15:43:56.075198 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc0b2c39-8c1e-4401-97f6-a4306b435436/object-auditor/0.log" Oct 03 15:43:56 crc kubenswrapper[4774]: I1003 15:43:56.134876 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc0b2c39-8c1e-4401-97f6-a4306b435436/object-expirer/0.log" Oct 03 15:43:56 crc kubenswrapper[4774]: I1003 15:43:56.206174 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc0b2c39-8c1e-4401-97f6-a4306b435436/object-replicator/0.log" Oct 03 15:43:56 crc kubenswrapper[4774]: I1003 15:43:56.299582 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc0b2c39-8c1e-4401-97f6-a4306b435436/object-server/0.log" Oct 03 15:43:56 crc kubenswrapper[4774]: I1003 15:43:56.359795 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc0b2c39-8c1e-4401-97f6-a4306b435436/object-updater/0.log" Oct 03 15:43:56 crc kubenswrapper[4774]: I1003 15:43:56.440393 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc0b2c39-8c1e-4401-97f6-a4306b435436/rsync/0.log" Oct 03 15:43:56 crc kubenswrapper[4774]: I1003 15:43:56.481032 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc0b2c39-8c1e-4401-97f6-a4306b435436/swift-recon-cron/0.log" Oct 03 15:43:56 crc kubenswrapper[4774]: I1003 15:43:56.677111 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7_433217d2-80d5-452b-9980-c1aaac39b5c1/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:43:56 crc kubenswrapper[4774]: I1003 15:43:56.892702 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_7b63f9fa-2194-46e4-bfe0-d7efb33f10fb/tempest-tests-tempest-tests-runner/0.log" Oct 03 15:43:56 crc kubenswrapper[4774]: I1003 15:43:56.922586 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_90552d32-4d94-4fc4-b843-60a78206b347/test-operator-logs-container/0.log" Oct 03 15:43:57 crc kubenswrapper[4774]: I1003 15:43:57.097907 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-lkc9p_b021b515-09e3-4fcd-b448-c8169043f86c/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:44:04 crc kubenswrapper[4774]: I1003 15:44:04.446847 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_09034f5f-3011-4604-8b05-f8a3fef6a74a/memcached/0.log" Oct 03 15:44:20 crc kubenswrapper[4774]: I1003 15:44:20.654033 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:44:20 crc kubenswrapper[4774]: I1003 15:44:20.654918 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:44:47 crc kubenswrapper[4774]: I1003 15:44:47.386692 4774 generic.go:334] 
"Generic (PLEG): container finished" podID="82c95baa-90fd-473a-9e2f-03ed4c4356e8" containerID="3a42f66e98606607a8760dca189f553ab933febe9eec76a246d64b849d67b91b" exitCode=0 Oct 03 15:44:47 crc kubenswrapper[4774]: I1003 15:44:47.386739 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r575n/crc-debug-mpbnh" event={"ID":"82c95baa-90fd-473a-9e2f-03ed4c4356e8","Type":"ContainerDied","Data":"3a42f66e98606607a8760dca189f553ab933febe9eec76a246d64b849d67b91b"} Oct 03 15:44:48 crc kubenswrapper[4774]: I1003 15:44:48.527095 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r575n/crc-debug-mpbnh" Oct 03 15:44:48 crc kubenswrapper[4774]: I1003 15:44:48.603091 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-r575n/crc-debug-mpbnh"] Oct 03 15:44:48 crc kubenswrapper[4774]: I1003 15:44:48.611542 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-r575n/crc-debug-mpbnh"] Oct 03 15:44:48 crc kubenswrapper[4774]: I1003 15:44:48.708806 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpx6l\" (UniqueName: \"kubernetes.io/projected/82c95baa-90fd-473a-9e2f-03ed4c4356e8-kube-api-access-gpx6l\") pod \"82c95baa-90fd-473a-9e2f-03ed4c4356e8\" (UID: \"82c95baa-90fd-473a-9e2f-03ed4c4356e8\") " Oct 03 15:44:48 crc kubenswrapper[4774]: I1003 15:44:48.708927 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82c95baa-90fd-473a-9e2f-03ed4c4356e8-host\") pod \"82c95baa-90fd-473a-9e2f-03ed4c4356e8\" (UID: \"82c95baa-90fd-473a-9e2f-03ed4c4356e8\") " Oct 03 15:44:48 crc kubenswrapper[4774]: I1003 15:44:48.709107 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82c95baa-90fd-473a-9e2f-03ed4c4356e8-host" (OuterVolumeSpecName: "host") pod 
"82c95baa-90fd-473a-9e2f-03ed4c4356e8" (UID: "82c95baa-90fd-473a-9e2f-03ed4c4356e8"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 15:44:48 crc kubenswrapper[4774]: I1003 15:44:48.710032 4774 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82c95baa-90fd-473a-9e2f-03ed4c4356e8-host\") on node \"crc\" DevicePath \"\"" Oct 03 15:44:48 crc kubenswrapper[4774]: I1003 15:44:48.717230 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82c95baa-90fd-473a-9e2f-03ed4c4356e8-kube-api-access-gpx6l" (OuterVolumeSpecName: "kube-api-access-gpx6l") pod "82c95baa-90fd-473a-9e2f-03ed4c4356e8" (UID: "82c95baa-90fd-473a-9e2f-03ed4c4356e8"). InnerVolumeSpecName "kube-api-access-gpx6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:44:48 crc kubenswrapper[4774]: I1003 15:44:48.812364 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpx6l\" (UniqueName: \"kubernetes.io/projected/82c95baa-90fd-473a-9e2f-03ed4c4356e8-kube-api-access-gpx6l\") on node \"crc\" DevicePath \"\"" Oct 03 15:44:49 crc kubenswrapper[4774]: I1003 15:44:49.321847 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82c95baa-90fd-473a-9e2f-03ed4c4356e8" path="/var/lib/kubelet/pods/82c95baa-90fd-473a-9e2f-03ed4c4356e8/volumes" Oct 03 15:44:49 crc kubenswrapper[4774]: I1003 15:44:49.417900 4774 scope.go:117] "RemoveContainer" containerID="3a42f66e98606607a8760dca189f553ab933febe9eec76a246d64b849d67b91b" Oct 03 15:44:49 crc kubenswrapper[4774]: I1003 15:44:49.417990 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r575n/crc-debug-mpbnh" Oct 03 15:44:49 crc kubenswrapper[4774]: I1003 15:44:49.751119 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r575n/crc-debug-k684m"] Oct 03 15:44:49 crc kubenswrapper[4774]: E1003 15:44:49.751648 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82c95baa-90fd-473a-9e2f-03ed4c4356e8" containerName="container-00" Oct 03 15:44:49 crc kubenswrapper[4774]: I1003 15:44:49.751660 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="82c95baa-90fd-473a-9e2f-03ed4c4356e8" containerName="container-00" Oct 03 15:44:49 crc kubenswrapper[4774]: I1003 15:44:49.751906 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="82c95baa-90fd-473a-9e2f-03ed4c4356e8" containerName="container-00" Oct 03 15:44:49 crc kubenswrapper[4774]: I1003 15:44:49.752686 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r575n/crc-debug-k684m" Oct 03 15:44:49 crc kubenswrapper[4774]: I1003 15:44:49.934696 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b77c7168-15cf-4a2f-a5be-465b5ff39fa7-host\") pod \"crc-debug-k684m\" (UID: \"b77c7168-15cf-4a2f-a5be-465b5ff39fa7\") " pod="openshift-must-gather-r575n/crc-debug-k684m" Oct 03 15:44:49 crc kubenswrapper[4774]: I1003 15:44:49.935409 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf6nf\" (UniqueName: \"kubernetes.io/projected/b77c7168-15cf-4a2f-a5be-465b5ff39fa7-kube-api-access-mf6nf\") pod \"crc-debug-k684m\" (UID: \"b77c7168-15cf-4a2f-a5be-465b5ff39fa7\") " pod="openshift-must-gather-r575n/crc-debug-k684m" Oct 03 15:44:50 crc kubenswrapper[4774]: I1003 15:44:50.038743 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mf6nf\" (UniqueName: 
\"kubernetes.io/projected/b77c7168-15cf-4a2f-a5be-465b5ff39fa7-kube-api-access-mf6nf\") pod \"crc-debug-k684m\" (UID: \"b77c7168-15cf-4a2f-a5be-465b5ff39fa7\") " pod="openshift-must-gather-r575n/crc-debug-k684m" Oct 03 15:44:50 crc kubenswrapper[4774]: I1003 15:44:50.038826 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b77c7168-15cf-4a2f-a5be-465b5ff39fa7-host\") pod \"crc-debug-k684m\" (UID: \"b77c7168-15cf-4a2f-a5be-465b5ff39fa7\") " pod="openshift-must-gather-r575n/crc-debug-k684m" Oct 03 15:44:50 crc kubenswrapper[4774]: I1003 15:44:50.039235 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b77c7168-15cf-4a2f-a5be-465b5ff39fa7-host\") pod \"crc-debug-k684m\" (UID: \"b77c7168-15cf-4a2f-a5be-465b5ff39fa7\") " pod="openshift-must-gather-r575n/crc-debug-k684m" Oct 03 15:44:50 crc kubenswrapper[4774]: I1003 15:44:50.060163 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf6nf\" (UniqueName: \"kubernetes.io/projected/b77c7168-15cf-4a2f-a5be-465b5ff39fa7-kube-api-access-mf6nf\") pod \"crc-debug-k684m\" (UID: \"b77c7168-15cf-4a2f-a5be-465b5ff39fa7\") " pod="openshift-must-gather-r575n/crc-debug-k684m" Oct 03 15:44:50 crc kubenswrapper[4774]: I1003 15:44:50.073195 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r575n/crc-debug-k684m" Oct 03 15:44:50 crc kubenswrapper[4774]: I1003 15:44:50.432939 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r575n/crc-debug-k684m" event={"ID":"b77c7168-15cf-4a2f-a5be-465b5ff39fa7","Type":"ContainerStarted","Data":"46fdfa35a21fd9aa7830c4e5f5abd673fa78a8fb0992e9575101e6911ad019a8"} Oct 03 15:44:50 crc kubenswrapper[4774]: I1003 15:44:50.433285 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r575n/crc-debug-k684m" event={"ID":"b77c7168-15cf-4a2f-a5be-465b5ff39fa7","Type":"ContainerStarted","Data":"c02df075132f43c1052efbf935cdcf7df67b6ebe2c87f86cd20a8b1856c45035"} Oct 03 15:44:50 crc kubenswrapper[4774]: I1003 15:44:50.653927 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:44:50 crc kubenswrapper[4774]: I1003 15:44:50.654006 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:44:51 crc kubenswrapper[4774]: I1003 15:44:51.447618 4774 generic.go:334] "Generic (PLEG): container finished" podID="b77c7168-15cf-4a2f-a5be-465b5ff39fa7" containerID="46fdfa35a21fd9aa7830c4e5f5abd673fa78a8fb0992e9575101e6911ad019a8" exitCode=0 Oct 03 15:44:51 crc kubenswrapper[4774]: I1003 15:44:51.447732 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r575n/crc-debug-k684m" 
event={"ID":"b77c7168-15cf-4a2f-a5be-465b5ff39fa7","Type":"ContainerDied","Data":"46fdfa35a21fd9aa7830c4e5f5abd673fa78a8fb0992e9575101e6911ad019a8"} Oct 03 15:44:52 crc kubenswrapper[4774]: I1003 15:44:52.564835 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r575n/crc-debug-k684m" Oct 03 15:44:52 crc kubenswrapper[4774]: I1003 15:44:52.684479 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b77c7168-15cf-4a2f-a5be-465b5ff39fa7-host\") pod \"b77c7168-15cf-4a2f-a5be-465b5ff39fa7\" (UID: \"b77c7168-15cf-4a2f-a5be-465b5ff39fa7\") " Oct 03 15:44:52 crc kubenswrapper[4774]: I1003 15:44:52.684568 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b77c7168-15cf-4a2f-a5be-465b5ff39fa7-host" (OuterVolumeSpecName: "host") pod "b77c7168-15cf-4a2f-a5be-465b5ff39fa7" (UID: "b77c7168-15cf-4a2f-a5be-465b5ff39fa7"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 15:44:52 crc kubenswrapper[4774]: I1003 15:44:52.684964 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mf6nf\" (UniqueName: \"kubernetes.io/projected/b77c7168-15cf-4a2f-a5be-465b5ff39fa7-kube-api-access-mf6nf\") pod \"b77c7168-15cf-4a2f-a5be-465b5ff39fa7\" (UID: \"b77c7168-15cf-4a2f-a5be-465b5ff39fa7\") " Oct 03 15:44:52 crc kubenswrapper[4774]: I1003 15:44:52.685413 4774 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b77c7168-15cf-4a2f-a5be-465b5ff39fa7-host\") on node \"crc\" DevicePath \"\"" Oct 03 15:44:52 crc kubenswrapper[4774]: I1003 15:44:52.690280 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b77c7168-15cf-4a2f-a5be-465b5ff39fa7-kube-api-access-mf6nf" (OuterVolumeSpecName: "kube-api-access-mf6nf") pod "b77c7168-15cf-4a2f-a5be-465b5ff39fa7" (UID: "b77c7168-15cf-4a2f-a5be-465b5ff39fa7"). InnerVolumeSpecName "kube-api-access-mf6nf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:44:52 crc kubenswrapper[4774]: I1003 15:44:52.786646 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mf6nf\" (UniqueName: \"kubernetes.io/projected/b77c7168-15cf-4a2f-a5be-465b5ff39fa7-kube-api-access-mf6nf\") on node \"crc\" DevicePath \"\"" Oct 03 15:44:53 crc kubenswrapper[4774]: I1003 15:44:53.465438 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r575n/crc-debug-k684m" event={"ID":"b77c7168-15cf-4a2f-a5be-465b5ff39fa7","Type":"ContainerDied","Data":"c02df075132f43c1052efbf935cdcf7df67b6ebe2c87f86cd20a8b1856c45035"} Oct 03 15:44:53 crc kubenswrapper[4774]: I1003 15:44:53.465478 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c02df075132f43c1052efbf935cdcf7df67b6ebe2c87f86cd20a8b1856c45035" Oct 03 15:44:53 crc kubenswrapper[4774]: I1003 15:44:53.465540 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r575n/crc-debug-k684m" Oct 03 15:44:57 crc kubenswrapper[4774]: I1003 15:44:57.992540 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-r575n/crc-debug-k684m"] Oct 03 15:44:58 crc kubenswrapper[4774]: I1003 15:44:58.005478 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-r575n/crc-debug-k684m"] Oct 03 15:44:59 crc kubenswrapper[4774]: I1003 15:44:59.207915 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r575n/crc-debug-smtpq"] Oct 03 15:44:59 crc kubenswrapper[4774]: E1003 15:44:59.208760 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b77c7168-15cf-4a2f-a5be-465b5ff39fa7" containerName="container-00" Oct 03 15:44:59 crc kubenswrapper[4774]: I1003 15:44:59.208777 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="b77c7168-15cf-4a2f-a5be-465b5ff39fa7" containerName="container-00" Oct 03 15:44:59 crc 
kubenswrapper[4774]: I1003 15:44:59.209029 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="b77c7168-15cf-4a2f-a5be-465b5ff39fa7" containerName="container-00" Oct 03 15:44:59 crc kubenswrapper[4774]: I1003 15:44:59.209971 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r575n/crc-debug-smtpq" Oct 03 15:44:59 crc kubenswrapper[4774]: I1003 15:44:59.286592 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66f7c\" (UniqueName: \"kubernetes.io/projected/ef2c9db9-b4bf-4c00-ab48-a43f8ae10bb3-kube-api-access-66f7c\") pod \"crc-debug-smtpq\" (UID: \"ef2c9db9-b4bf-4c00-ab48-a43f8ae10bb3\") " pod="openshift-must-gather-r575n/crc-debug-smtpq" Oct 03 15:44:59 crc kubenswrapper[4774]: I1003 15:44:59.286678 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef2c9db9-b4bf-4c00-ab48-a43f8ae10bb3-host\") pod \"crc-debug-smtpq\" (UID: \"ef2c9db9-b4bf-4c00-ab48-a43f8ae10bb3\") " pod="openshift-must-gather-r575n/crc-debug-smtpq" Oct 03 15:44:59 crc kubenswrapper[4774]: I1003 15:44:59.314155 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b77c7168-15cf-4a2f-a5be-465b5ff39fa7" path="/var/lib/kubelet/pods/b77c7168-15cf-4a2f-a5be-465b5ff39fa7/volumes" Oct 03 15:44:59 crc kubenswrapper[4774]: I1003 15:44:59.389744 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66f7c\" (UniqueName: \"kubernetes.io/projected/ef2c9db9-b4bf-4c00-ab48-a43f8ae10bb3-kube-api-access-66f7c\") pod \"crc-debug-smtpq\" (UID: \"ef2c9db9-b4bf-4c00-ab48-a43f8ae10bb3\") " pod="openshift-must-gather-r575n/crc-debug-smtpq" Oct 03 15:44:59 crc kubenswrapper[4774]: I1003 15:44:59.389868 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/ef2c9db9-b4bf-4c00-ab48-a43f8ae10bb3-host\") pod \"crc-debug-smtpq\" (UID: \"ef2c9db9-b4bf-4c00-ab48-a43f8ae10bb3\") " pod="openshift-must-gather-r575n/crc-debug-smtpq" Oct 03 15:44:59 crc kubenswrapper[4774]: I1003 15:44:59.390583 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef2c9db9-b4bf-4c00-ab48-a43f8ae10bb3-host\") pod \"crc-debug-smtpq\" (UID: \"ef2c9db9-b4bf-4c00-ab48-a43f8ae10bb3\") " pod="openshift-must-gather-r575n/crc-debug-smtpq" Oct 03 15:44:59 crc kubenswrapper[4774]: I1003 15:44:59.425072 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66f7c\" (UniqueName: \"kubernetes.io/projected/ef2c9db9-b4bf-4c00-ab48-a43f8ae10bb3-kube-api-access-66f7c\") pod \"crc-debug-smtpq\" (UID: \"ef2c9db9-b4bf-4c00-ab48-a43f8ae10bb3\") " pod="openshift-must-gather-r575n/crc-debug-smtpq" Oct 03 15:44:59 crc kubenswrapper[4774]: I1003 15:44:59.531421 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r575n/crc-debug-smtpq" Oct 03 15:45:00 crc kubenswrapper[4774]: I1003 15:45:00.237668 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325105-72d9z"] Oct 03 15:45:00 crc kubenswrapper[4774]: I1003 15:45:00.241090 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325105-72d9z" Oct 03 15:45:00 crc kubenswrapper[4774]: I1003 15:45:00.248745 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 15:45:00 crc kubenswrapper[4774]: I1003 15:45:00.248743 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 15:45:00 crc kubenswrapper[4774]: I1003 15:45:00.251425 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325105-72d9z"] Oct 03 15:45:00 crc kubenswrapper[4774]: I1003 15:45:00.315959 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/498fbf9c-ad69-4822-a313-130c21ae73b1-secret-volume\") pod \"collect-profiles-29325105-72d9z\" (UID: \"498fbf9c-ad69-4822-a313-130c21ae73b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325105-72d9z" Oct 03 15:45:00 crc kubenswrapper[4774]: I1003 15:45:00.315998 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/498fbf9c-ad69-4822-a313-130c21ae73b1-config-volume\") pod \"collect-profiles-29325105-72d9z\" (UID: \"498fbf9c-ad69-4822-a313-130c21ae73b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325105-72d9z" Oct 03 15:45:00 crc kubenswrapper[4774]: I1003 15:45:00.316055 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cft8d\" (UniqueName: \"kubernetes.io/projected/498fbf9c-ad69-4822-a313-130c21ae73b1-kube-api-access-cft8d\") pod \"collect-profiles-29325105-72d9z\" (UID: \"498fbf9c-ad69-4822-a313-130c21ae73b1\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29325105-72d9z" Oct 03 15:45:00 crc kubenswrapper[4774]: I1003 15:45:00.417755 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/498fbf9c-ad69-4822-a313-130c21ae73b1-config-volume\") pod \"collect-profiles-29325105-72d9z\" (UID: \"498fbf9c-ad69-4822-a313-130c21ae73b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325105-72d9z" Oct 03 15:45:00 crc kubenswrapper[4774]: I1003 15:45:00.417808 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/498fbf9c-ad69-4822-a313-130c21ae73b1-secret-volume\") pod \"collect-profiles-29325105-72d9z\" (UID: \"498fbf9c-ad69-4822-a313-130c21ae73b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325105-72d9z" Oct 03 15:45:00 crc kubenswrapper[4774]: I1003 15:45:00.417900 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cft8d\" (UniqueName: \"kubernetes.io/projected/498fbf9c-ad69-4822-a313-130c21ae73b1-kube-api-access-cft8d\") pod \"collect-profiles-29325105-72d9z\" (UID: \"498fbf9c-ad69-4822-a313-130c21ae73b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325105-72d9z" Oct 03 15:45:00 crc kubenswrapper[4774]: I1003 15:45:00.420092 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/498fbf9c-ad69-4822-a313-130c21ae73b1-config-volume\") pod \"collect-profiles-29325105-72d9z\" (UID: \"498fbf9c-ad69-4822-a313-130c21ae73b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325105-72d9z" Oct 03 15:45:00 crc kubenswrapper[4774]: I1003 15:45:00.424268 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/498fbf9c-ad69-4822-a313-130c21ae73b1-secret-volume\") pod \"collect-profiles-29325105-72d9z\" (UID: \"498fbf9c-ad69-4822-a313-130c21ae73b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325105-72d9z" Oct 03 15:45:00 crc kubenswrapper[4774]: I1003 15:45:00.452535 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cft8d\" (UniqueName: \"kubernetes.io/projected/498fbf9c-ad69-4822-a313-130c21ae73b1-kube-api-access-cft8d\") pod \"collect-profiles-29325105-72d9z\" (UID: \"498fbf9c-ad69-4822-a313-130c21ae73b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325105-72d9z" Oct 03 15:45:00 crc kubenswrapper[4774]: I1003 15:45:00.537088 4774 generic.go:334] "Generic (PLEG): container finished" podID="ef2c9db9-b4bf-4c00-ab48-a43f8ae10bb3" containerID="48dc07e142f5f554cfe327d80a5f024c05ea721a9d8169a38af0d03a393481f1" exitCode=0 Oct 03 15:45:00 crc kubenswrapper[4774]: I1003 15:45:00.537142 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r575n/crc-debug-smtpq" event={"ID":"ef2c9db9-b4bf-4c00-ab48-a43f8ae10bb3","Type":"ContainerDied","Data":"48dc07e142f5f554cfe327d80a5f024c05ea721a9d8169a38af0d03a393481f1"} Oct 03 15:45:00 crc kubenswrapper[4774]: I1003 15:45:00.537174 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r575n/crc-debug-smtpq" event={"ID":"ef2c9db9-b4bf-4c00-ab48-a43f8ae10bb3","Type":"ContainerStarted","Data":"03ea0b594b138b24ab690fd1188a2fd8f3a30e4df88c63355f7081c6ee4703da"} Oct 03 15:45:00 crc kubenswrapper[4774]: I1003 15:45:00.566563 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325105-72d9z" Oct 03 15:45:00 crc kubenswrapper[4774]: I1003 15:45:00.579503 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-r575n/crc-debug-smtpq"] Oct 03 15:45:00 crc kubenswrapper[4774]: I1003 15:45:00.587103 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-r575n/crc-debug-smtpq"] Oct 03 15:45:01 crc kubenswrapper[4774]: I1003 15:45:01.004617 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325105-72d9z"] Oct 03 15:45:01 crc kubenswrapper[4774]: W1003 15:45:01.011724 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod498fbf9c_ad69_4822_a313_130c21ae73b1.slice/crio-7d27cba2cd941bde1670454343535b916fc4992f3006b831747e9c0248a6e0c9 WatchSource:0}: Error finding container 7d27cba2cd941bde1670454343535b916fc4992f3006b831747e9c0248a6e0c9: Status 404 returned error can't find the container with id 7d27cba2cd941bde1670454343535b916fc4992f3006b831747e9c0248a6e0c9 Oct 03 15:45:01 crc kubenswrapper[4774]: I1003 15:45:01.554445 4774 generic.go:334] "Generic (PLEG): container finished" podID="498fbf9c-ad69-4822-a313-130c21ae73b1" containerID="28e39aa631f00a541e40bd78213ca3ba9ff57e4aa017ea2cb0992baba559dae4" exitCode=0 Oct 03 15:45:01 crc kubenswrapper[4774]: I1003 15:45:01.554488 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325105-72d9z" event={"ID":"498fbf9c-ad69-4822-a313-130c21ae73b1","Type":"ContainerDied","Data":"28e39aa631f00a541e40bd78213ca3ba9ff57e4aa017ea2cb0992baba559dae4"} Oct 03 15:45:01 crc kubenswrapper[4774]: I1003 15:45:01.554532 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325105-72d9z" 
event={"ID":"498fbf9c-ad69-4822-a313-130c21ae73b1","Type":"ContainerStarted","Data":"7d27cba2cd941bde1670454343535b916fc4992f3006b831747e9c0248a6e0c9"} Oct 03 15:45:01 crc kubenswrapper[4774]: I1003 15:45:01.656975 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r575n/crc-debug-smtpq" Oct 03 15:45:01 crc kubenswrapper[4774]: I1003 15:45:01.744713 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef2c9db9-b4bf-4c00-ab48-a43f8ae10bb3-host\") pod \"ef2c9db9-b4bf-4c00-ab48-a43f8ae10bb3\" (UID: \"ef2c9db9-b4bf-4c00-ab48-a43f8ae10bb3\") " Oct 03 15:45:01 crc kubenswrapper[4774]: I1003 15:45:01.744829 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef2c9db9-b4bf-4c00-ab48-a43f8ae10bb3-host" (OuterVolumeSpecName: "host") pod "ef2c9db9-b4bf-4c00-ab48-a43f8ae10bb3" (UID: "ef2c9db9-b4bf-4c00-ab48-a43f8ae10bb3"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 15:45:01 crc kubenswrapper[4774]: I1003 15:45:01.744859 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66f7c\" (UniqueName: \"kubernetes.io/projected/ef2c9db9-b4bf-4c00-ab48-a43f8ae10bb3-kube-api-access-66f7c\") pod \"ef2c9db9-b4bf-4c00-ab48-a43f8ae10bb3\" (UID: \"ef2c9db9-b4bf-4c00-ab48-a43f8ae10bb3\") " Oct 03 15:45:01 crc kubenswrapper[4774]: I1003 15:45:01.745462 4774 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef2c9db9-b4bf-4c00-ab48-a43f8ae10bb3-host\") on node \"crc\" DevicePath \"\"" Oct 03 15:45:01 crc kubenswrapper[4774]: I1003 15:45:01.750424 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef2c9db9-b4bf-4c00-ab48-a43f8ae10bb3-kube-api-access-66f7c" (OuterVolumeSpecName: "kube-api-access-66f7c") pod "ef2c9db9-b4bf-4c00-ab48-a43f8ae10bb3" (UID: "ef2c9db9-b4bf-4c00-ab48-a43f8ae10bb3"). InnerVolumeSpecName "kube-api-access-66f7c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:45:01 crc kubenswrapper[4774]: I1003 15:45:01.847297 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66f7c\" (UniqueName: \"kubernetes.io/projected/ef2c9db9-b4bf-4c00-ab48-a43f8ae10bb3-kube-api-access-66f7c\") on node \"crc\" DevicePath \"\"" Oct 03 15:45:02 crc kubenswrapper[4774]: I1003 15:45:02.114842 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6c675fb79f-b54rk_cea60414-e959-4200-b3e5-e532d2136047/kube-rbac-proxy/0.log" Oct 03 15:45:02 crc kubenswrapper[4774]: I1003 15:45:02.134898 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6c675fb79f-b54rk_cea60414-e959-4200-b3e5-e532d2136047/manager/0.log" Oct 03 15:45:02 crc kubenswrapper[4774]: I1003 15:45:02.303299 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79d68d6c85-6gckj_4c3d1495-6568-44c2-9bd7-82256a4b5aab/kube-rbac-proxy/0.log" Oct 03 15:45:02 crc kubenswrapper[4774]: I1003 15:45:02.352048 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79d68d6c85-6gckj_4c3d1495-6568-44c2-9bd7-82256a4b5aab/manager/0.log" Oct 03 15:45:02 crc kubenswrapper[4774]: I1003 15:45:02.468404 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-m2wmk_4cf018fb-edab-4e23-ad04-763ee25e1613/kube-rbac-proxy/0.log" Oct 03 15:45:02 crc kubenswrapper[4774]: I1003 15:45:02.521060 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-m2wmk_4cf018fb-edab-4e23-ad04-763ee25e1613/manager/0.log" Oct 03 15:45:02 crc kubenswrapper[4774]: I1003 15:45:02.564530 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r575n/crc-debug-smtpq" Oct 03 15:45:02 crc kubenswrapper[4774]: I1003 15:45:02.564773 4774 scope.go:117] "RemoveContainer" containerID="48dc07e142f5f554cfe327d80a5f024c05ea721a9d8169a38af0d03a393481f1" Oct 03 15:45:02 crc kubenswrapper[4774]: I1003 15:45:02.571452 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh_128ae2e6-72c5-44ef-bf0a-6f54d80796cd/util/0.log" Oct 03 15:45:02 crc kubenswrapper[4774]: I1003 15:45:02.780859 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh_128ae2e6-72c5-44ef-bf0a-6f54d80796cd/util/0.log" Oct 03 15:45:02 crc kubenswrapper[4774]: I1003 15:45:02.785197 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh_128ae2e6-72c5-44ef-bf0a-6f54d80796cd/pull/0.log" Oct 03 15:45:02 crc kubenswrapper[4774]: I1003 15:45:02.829044 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh_128ae2e6-72c5-44ef-bf0a-6f54d80796cd/pull/0.log" Oct 03 15:45:02 crc kubenswrapper[4774]: I1003 15:45:02.891291 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325105-72d9z" Oct 03 15:45:02 crc kubenswrapper[4774]: I1003 15:45:02.967449 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/498fbf9c-ad69-4822-a313-130c21ae73b1-secret-volume\") pod \"498fbf9c-ad69-4822-a313-130c21ae73b1\" (UID: \"498fbf9c-ad69-4822-a313-130c21ae73b1\") " Oct 03 15:45:02 crc kubenswrapper[4774]: I1003 15:45:02.967572 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cft8d\" (UniqueName: \"kubernetes.io/projected/498fbf9c-ad69-4822-a313-130c21ae73b1-kube-api-access-cft8d\") pod \"498fbf9c-ad69-4822-a313-130c21ae73b1\" (UID: \"498fbf9c-ad69-4822-a313-130c21ae73b1\") " Oct 03 15:45:02 crc kubenswrapper[4774]: I1003 15:45:02.967711 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/498fbf9c-ad69-4822-a313-130c21ae73b1-config-volume\") pod \"498fbf9c-ad69-4822-a313-130c21ae73b1\" (UID: \"498fbf9c-ad69-4822-a313-130c21ae73b1\") " Oct 03 15:45:02 crc kubenswrapper[4774]: I1003 15:45:02.968298 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/498fbf9c-ad69-4822-a313-130c21ae73b1-config-volume" (OuterVolumeSpecName: "config-volume") pod "498fbf9c-ad69-4822-a313-130c21ae73b1" (UID: "498fbf9c-ad69-4822-a313-130c21ae73b1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:45:02 crc kubenswrapper[4774]: I1003 15:45:02.973659 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/498fbf9c-ad69-4822-a313-130c21ae73b1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "498fbf9c-ad69-4822-a313-130c21ae73b1" (UID: "498fbf9c-ad69-4822-a313-130c21ae73b1"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:45:02 crc kubenswrapper[4774]: I1003 15:45:02.977624 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/498fbf9c-ad69-4822-a313-130c21ae73b1-kube-api-access-cft8d" (OuterVolumeSpecName: "kube-api-access-cft8d") pod "498fbf9c-ad69-4822-a313-130c21ae73b1" (UID: "498fbf9c-ad69-4822-a313-130c21ae73b1"). InnerVolumeSpecName "kube-api-access-cft8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:45:02 crc kubenswrapper[4774]: I1003 15:45:02.983988 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh_128ae2e6-72c5-44ef-bf0a-6f54d80796cd/pull/0.log" Oct 03 15:45:02 crc kubenswrapper[4774]: I1003 15:45:02.984729 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh_128ae2e6-72c5-44ef-bf0a-6f54d80796cd/extract/0.log" Oct 03 15:45:02 crc kubenswrapper[4774]: I1003 15:45:02.988711 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh_128ae2e6-72c5-44ef-bf0a-6f54d80796cd/util/0.log" Oct 03 15:45:03 crc kubenswrapper[4774]: I1003 15:45:03.070557 4774 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/498fbf9c-ad69-4822-a313-130c21ae73b1-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 15:45:03 crc kubenswrapper[4774]: I1003 15:45:03.070587 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cft8d\" (UniqueName: \"kubernetes.io/projected/498fbf9c-ad69-4822-a313-130c21ae73b1-kube-api-access-cft8d\") on node \"crc\" DevicePath \"\"" Oct 03 15:45:03 crc kubenswrapper[4774]: I1003 15:45:03.070596 4774 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/498fbf9c-ad69-4822-a313-130c21ae73b1-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 15:45:03 crc kubenswrapper[4774]: I1003 15:45:03.175213 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-846dff85b5-v5kkx_29905a36-139e-4611-bc8e-0289dd1fa0b4/kube-rbac-proxy/0.log" Oct 03 15:45:03 crc kubenswrapper[4774]: I1003 15:45:03.215431 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-846dff85b5-v5kkx_29905a36-139e-4611-bc8e-0289dd1fa0b4/manager/0.log" Oct 03 15:45:03 crc kubenswrapper[4774]: I1003 15:45:03.240325 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-599898f689-mhz7w_ecf9de8d-83d4-41cb-9b00-e3aeedfb93fd/kube-rbac-proxy/0.log" Oct 03 15:45:03 crc kubenswrapper[4774]: I1003 15:45:03.329584 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef2c9db9-b4bf-4c00-ab48-a43f8ae10bb3" path="/var/lib/kubelet/pods/ef2c9db9-b4bf-4c00-ab48-a43f8ae10bb3/volumes" Oct 03 15:45:03 crc kubenswrapper[4774]: I1003 15:45:03.379161 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-599898f689-mhz7w_ecf9de8d-83d4-41cb-9b00-e3aeedfb93fd/manager/0.log" Oct 03 15:45:03 crc kubenswrapper[4774]: I1003 15:45:03.422055 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6769b867d9-2tqv8_02bdd1b6-4d8f-40ce-b0fe-449c738d5d0e/kube-rbac-proxy/0.log" Oct 03 15:45:03 crc kubenswrapper[4774]: I1003 15:45:03.481637 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6769b867d9-2tqv8_02bdd1b6-4d8f-40ce-b0fe-449c738d5d0e/manager/0.log" Oct 03 15:45:03 crc kubenswrapper[4774]: I1003 15:45:03.575013 4774 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325105-72d9z" event={"ID":"498fbf9c-ad69-4822-a313-130c21ae73b1","Type":"ContainerDied","Data":"7d27cba2cd941bde1670454343535b916fc4992f3006b831747e9c0248a6e0c9"} Oct 03 15:45:03 crc kubenswrapper[4774]: I1003 15:45:03.575071 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d27cba2cd941bde1670454343535b916fc4992f3006b831747e9c0248a6e0c9" Oct 03 15:45:03 crc kubenswrapper[4774]: I1003 15:45:03.576913 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325105-72d9z" Oct 03 15:45:03 crc kubenswrapper[4774]: I1003 15:45:03.601704 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5fbf469cd7-zhmr8_bc3a311f-a6c2-40e4-aaae-549aa2395c57/kube-rbac-proxy/0.log" Oct 03 15:45:03 crc kubenswrapper[4774]: I1003 15:45:03.742403 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5fbf469cd7-zhmr8_bc3a311f-a6c2-40e4-aaae-549aa2395c57/manager/0.log" Oct 03 15:45:03 crc kubenswrapper[4774]: I1003 15:45:03.768047 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-84bc9db6cc-2sg6f_741c17eb-da65-4dce-abc6-7faa47d28004/kube-rbac-proxy/0.log" Oct 03 15:45:03 crc kubenswrapper[4774]: I1003 15:45:03.800128 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-84bc9db6cc-2sg6f_741c17eb-da65-4dce-abc6-7faa47d28004/manager/0.log" Oct 03 15:45:03 crc kubenswrapper[4774]: I1003 15:45:03.959254 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325060-lqwfp"] Oct 03 15:45:03 crc kubenswrapper[4774]: I1003 15:45:03.966184 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29325060-lqwfp"] Oct 03 15:45:03 crc kubenswrapper[4774]: I1003 15:45:03.989725 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7f55849f88-fw4zb_b32ca090-1129-4c77-a2b3-df9e51a35a48/manager/0.log" Oct 03 15:45:03 crc kubenswrapper[4774]: I1003 15:45:03.999239 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7f55849f88-fw4zb_b32ca090-1129-4c77-a2b3-df9e51a35a48/kube-rbac-proxy/0.log" Oct 03 15:45:04 crc kubenswrapper[4774]: I1003 15:45:04.088264 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6fd6854b49-24bvc_10674e8a-5afd-45f7-af36-e9dbfaf2dba0/kube-rbac-proxy/0.log" Oct 03 15:45:04 crc kubenswrapper[4774]: I1003 15:45:04.160336 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6fd6854b49-24bvc_10674e8a-5afd-45f7-af36-e9dbfaf2dba0/manager/0.log" Oct 03 15:45:04 crc kubenswrapper[4774]: I1003 15:45:04.186564 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5c468bf4d4-mtjh7_6edea7f2-581f-4f41-bdda-45e83dce680d/kube-rbac-proxy/0.log" Oct 03 15:45:04 crc kubenswrapper[4774]: I1003 15:45:04.279656 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5c468bf4d4-mtjh7_6edea7f2-581f-4f41-bdda-45e83dce680d/manager/0.log" Oct 03 15:45:04 crc kubenswrapper[4774]: I1003 15:45:04.395609 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6574bf987d-nw7mh_eaf48bda-c7ca-484b-8d8f-b195d011e8f9/kube-rbac-proxy/0.log" Oct 03 15:45:04 crc kubenswrapper[4774]: I1003 15:45:04.438303 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6574bf987d-nw7mh_eaf48bda-c7ca-484b-8d8f-b195d011e8f9/manager/0.log" Oct 03 15:45:04 crc kubenswrapper[4774]: I1003 15:45:04.522332 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-555c7456bd-fhd72_74855628-79e3-4300-8a8b-d05aeed1904b/kube-rbac-proxy/0.log" Oct 03 15:45:04 crc kubenswrapper[4774]: I1003 15:45:04.630803 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-59d6cfdf45-84vwj_5c00d52a-acc5-4650-8b36-48faa90030a3/kube-rbac-proxy/0.log" Oct 03 15:45:04 crc kubenswrapper[4774]: I1003 15:45:04.651578 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-555c7456bd-fhd72_74855628-79e3-4300-8a8b-d05aeed1904b/manager/0.log" Oct 03 15:45:04 crc kubenswrapper[4774]: I1003 15:45:04.735113 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-59d6cfdf45-84vwj_5c00d52a-acc5-4650-8b36-48faa90030a3/manager/0.log" Oct 03 15:45:04 crc kubenswrapper[4774]: I1003 15:45:04.831491 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6f64c4d678chmhg_2b82b2ba-da6d-4441-a194-4b47207b159a/manager/0.log" Oct 03 15:45:04 crc kubenswrapper[4774]: I1003 15:45:04.832432 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6f64c4d678chmhg_2b82b2ba-da6d-4441-a194-4b47207b159a/kube-rbac-proxy/0.log" Oct 03 15:45:04 crc kubenswrapper[4774]: I1003 15:45:04.994969 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6977957f88-8kmrq_73dd9462-cd4d-40d8-a416-c8ed1ef328fb/kube-rbac-proxy/0.log" Oct 03 15:45:05 crc kubenswrapper[4774]: 
I1003 15:45:05.125486 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7c89b76849-9klgg_4ad90ede-158d-4798-a2d5-399d61654604/kube-rbac-proxy/0.log" Oct 03 15:45:05 crc kubenswrapper[4774]: I1003 15:45:05.313954 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba5eed25-71ac-44e3-bf15-daf7b9ac13c6" path="/var/lib/kubelet/pods/ba5eed25-71ac-44e3-bf15-daf7b9ac13c6/volumes" Oct 03 15:45:05 crc kubenswrapper[4774]: I1003 15:45:05.327871 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-2t96w_a97efab2-9188-4828-a600-d346b724f1f9/registry-server/0.log" Oct 03 15:45:05 crc kubenswrapper[4774]: I1003 15:45:05.475824 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7c89b76849-9klgg_4ad90ede-158d-4798-a2d5-399d61654604/operator/0.log" Oct 03 15:45:05 crc kubenswrapper[4774]: I1003 15:45:05.490141 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-688db7b6c7-c7hxr_14fc26c3-ab56-44a5-832c-55eaca43cc5c/kube-rbac-proxy/0.log" Oct 03 15:45:05 crc kubenswrapper[4774]: I1003 15:45:05.641348 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-688db7b6c7-c7hxr_14fc26c3-ab56-44a5-832c-55eaca43cc5c/manager/0.log" Oct 03 15:45:05 crc kubenswrapper[4774]: I1003 15:45:05.693760 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-7d8bb7f44c-pqrxt_35aefd42-2274-451a-8526-fb99c1f72be0/manager/0.log" Oct 03 15:45:05 crc kubenswrapper[4774]: I1003 15:45:05.750920 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-7d8bb7f44c-pqrxt_35aefd42-2274-451a-8526-fb99c1f72be0/kube-rbac-proxy/0.log" Oct 03 
15:45:05 crc kubenswrapper[4774]: I1003 15:45:05.896052 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-xqmql_77b9687d-958c-47ad-835e-160fc6214d72/operator/0.log" Oct 03 15:45:05 crc kubenswrapper[4774]: I1003 15:45:05.976653 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-j5d2d_e9d1e188-b3ff-4807-a57e-9bf290e10f22/kube-rbac-proxy/0.log" Oct 03 15:45:06 crc kubenswrapper[4774]: I1003 15:45:06.023765 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6977957f88-8kmrq_73dd9462-cd4d-40d8-a416-c8ed1ef328fb/manager/0.log" Oct 03 15:45:06 crc kubenswrapper[4774]: I1003 15:45:06.099167 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-j5d2d_e9d1e188-b3ff-4807-a57e-9bf290e10f22/manager/0.log" Oct 03 15:45:06 crc kubenswrapper[4774]: I1003 15:45:06.135013 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5db5cf686f-zpq8n_fcb0af6a-547c-4555-86f9-f0b390ae7ce3/kube-rbac-proxy/0.log" Oct 03 15:45:06 crc kubenswrapper[4774]: I1003 15:45:06.207789 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5db5cf686f-zpq8n_fcb0af6a-547c-4555-86f9-f0b390ae7ce3/manager/0.log" Oct 03 15:45:06 crc kubenswrapper[4774]: I1003 15:45:06.291337 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-ftzfn_6f1973d7-94ab-4855-bfd4-91f1e677306f/manager/0.log" Oct 03 15:45:06 crc kubenswrapper[4774]: I1003 15:45:06.299202 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-ftzfn_6f1973d7-94ab-4855-bfd4-91f1e677306f/kube-rbac-proxy/0.log" Oct 03 15:45:06 crc kubenswrapper[4774]: I1003 15:45:06.390453 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-fcd7d9895-ngbtk_b8324f27-b72f-4ad9-adcb-82469098520a/kube-rbac-proxy/0.log" Oct 03 15:45:06 crc kubenswrapper[4774]: I1003 15:45:06.421476 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-fcd7d9895-ngbtk_b8324f27-b72f-4ad9-adcb-82469098520a/manager/0.log" Oct 03 15:45:20 crc kubenswrapper[4774]: I1003 15:45:20.653811 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:45:20 crc kubenswrapper[4774]: I1003 15:45:20.654306 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:45:20 crc kubenswrapper[4774]: I1003 15:45:20.654351 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" Oct 03 15:45:20 crc kubenswrapper[4774]: I1003 15:45:20.655137 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6a9aac6dbc322b9090f9a59162378fc7732101d04a69222bfa4d228837707411"} pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Oct 03 15:45:20 crc kubenswrapper[4774]: I1003 15:45:20.655189 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" containerID="cri-o://6a9aac6dbc322b9090f9a59162378fc7732101d04a69222bfa4d228837707411" gracePeriod=600 Oct 03 15:45:20 crc kubenswrapper[4774]: E1003 15:45:20.797847 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:45:20 crc kubenswrapper[4774]: I1003 15:45:20.990623 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-7qxcv_98cf7a02-ed29-4cd4-9f60-77659f186e4b/control-plane-machine-set-operator/0.log" Oct 03 15:45:21 crc kubenswrapper[4774]: I1003 15:45:21.106190 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7nrwv_b5e417ed-5b5e-405b-8b95-ed27ddaef9ee/machine-api-operator/0.log" Oct 03 15:45:21 crc kubenswrapper[4774]: I1003 15:45:21.128109 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7nrwv_b5e417ed-5b5e-405b-8b95-ed27ddaef9ee/kube-rbac-proxy/0.log" Oct 03 15:45:21 crc kubenswrapper[4774]: I1003 15:45:21.728650 4774 generic.go:334] "Generic (PLEG): container finished" podID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerID="6a9aac6dbc322b9090f9a59162378fc7732101d04a69222bfa4d228837707411" exitCode=0 Oct 03 
15:45:21 crc kubenswrapper[4774]: I1003 15:45:21.728703 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" event={"ID":"ca37ac4b-f421-4198-a179-12901d36f0f5","Type":"ContainerDied","Data":"6a9aac6dbc322b9090f9a59162378fc7732101d04a69222bfa4d228837707411"} Oct 03 15:45:21 crc kubenswrapper[4774]: I1003 15:45:21.728847 4774 scope.go:117] "RemoveContainer" containerID="f2ac2522d841de5c1ea1ba89a06aceeed8a354385c6834a910301458a95a43d3" Oct 03 15:45:21 crc kubenswrapper[4774]: I1003 15:45:21.729734 4774 scope.go:117] "RemoveContainer" containerID="6a9aac6dbc322b9090f9a59162378fc7732101d04a69222bfa4d228837707411" Oct 03 15:45:21 crc kubenswrapper[4774]: E1003 15:45:21.730268 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:45:27 crc kubenswrapper[4774]: I1003 15:45:27.333192 4774 scope.go:117] "RemoveContainer" containerID="6280c8b355da60f28b4280f79f5bac30a90c851afd14db16666528c1575c1830" Oct 03 15:45:33 crc kubenswrapper[4774]: I1003 15:45:33.751147 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-h2fts_769d7391-5628-4bcd-af8d-accf8b37c400/cert-manager-controller/0.log" Oct 03 15:45:33 crc kubenswrapper[4774]: I1003 15:45:33.894937 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-gghpw_1f99341f-994f-496c-9287-f0fa80429b74/cert-manager-cainjector/0.log" Oct 03 15:45:33 crc kubenswrapper[4774]: I1003 15:45:33.941811 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-hggn6_3d3ba37a-f1af-431c-a733-19283fc5c055/cert-manager-webhook/0.log" Oct 03 15:45:37 crc kubenswrapper[4774]: I1003 15:45:37.300737 4774 scope.go:117] "RemoveContainer" containerID="6a9aac6dbc322b9090f9a59162378fc7732101d04a69222bfa4d228837707411" Oct 03 15:45:37 crc kubenswrapper[4774]: E1003 15:45:37.301414 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:45:47 crc kubenswrapper[4774]: I1003 15:45:47.120646 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-p8hlt_e1390648-f53e-4c1c-a801-2c76be9fa959/nmstate-console-plugin/0.log" Oct 03 15:45:47 crc kubenswrapper[4774]: I1003 15:45:47.302360 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-bc8h5_6aaa4a80-d658-4f5a-8b6c-cc84ad781c64/nmstate-handler/0.log" Oct 03 15:45:47 crc kubenswrapper[4774]: I1003 15:45:47.342438 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-4lb7n_bd99f12e-3622-42d7-bece-2149da359b49/kube-rbac-proxy/0.log" Oct 03 15:45:47 crc kubenswrapper[4774]: I1003 15:45:47.434792 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-4lb7n_bd99f12e-3622-42d7-bece-2149da359b49/nmstate-metrics/0.log" Oct 03 15:45:47 crc kubenswrapper[4774]: I1003 15:45:47.505736 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-59jwx_a0e3f6b9-0e1a-4627-aad6-b6252f7f4ab9/nmstate-operator/0.log" Oct 03 15:45:47 crc kubenswrapper[4774]: I1003 15:45:47.621186 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-hrzkg_e582dd92-97d1-48bf-a81b-ad144d0a89cf/nmstate-webhook/0.log" Oct 03 15:45:49 crc kubenswrapper[4774]: I1003 15:45:49.312430 4774 scope.go:117] "RemoveContainer" containerID="6a9aac6dbc322b9090f9a59162378fc7732101d04a69222bfa4d228837707411" Oct 03 15:45:49 crc kubenswrapper[4774]: E1003 15:45:49.312911 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:46:03 crc kubenswrapper[4774]: I1003 15:46:03.016393 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-qp2nq_625d3121-98b6-42e6-bc58-ea4bbdc5a7ad/kube-rbac-proxy/0.log" Oct 03 15:46:03 crc kubenswrapper[4774]: I1003 15:46:03.156597 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-qp2nq_625d3121-98b6-42e6-bc58-ea4bbdc5a7ad/controller/0.log" Oct 03 15:46:03 crc kubenswrapper[4774]: I1003 15:46:03.239533 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/cp-frr-files/0.log" Oct 03 15:46:03 crc kubenswrapper[4774]: I1003 15:46:03.300237 4774 scope.go:117] "RemoveContainer" containerID="6a9aac6dbc322b9090f9a59162378fc7732101d04a69222bfa4d228837707411" Oct 03 15:46:03 crc kubenswrapper[4774]: E1003 15:46:03.300622 4774 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:46:03 crc kubenswrapper[4774]: I1003 15:46:03.934825 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/cp-frr-files/0.log" Oct 03 15:46:03 crc kubenswrapper[4774]: I1003 15:46:03.994193 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/cp-metrics/0.log" Oct 03 15:46:04 crc kubenswrapper[4774]: I1003 15:46:04.024470 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/cp-reloader/0.log" Oct 03 15:46:04 crc kubenswrapper[4774]: I1003 15:46:04.026796 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/cp-reloader/0.log" Oct 03 15:46:04 crc kubenswrapper[4774]: I1003 15:46:04.185818 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/cp-frr-files/0.log" Oct 03 15:46:04 crc kubenswrapper[4774]: I1003 15:46:04.230744 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/cp-reloader/0.log" Oct 03 15:46:04 crc kubenswrapper[4774]: I1003 15:46:04.246574 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/cp-metrics/0.log" Oct 03 15:46:04 crc kubenswrapper[4774]: I1003 15:46:04.262506 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/cp-metrics/0.log" Oct 03 15:46:04 crc kubenswrapper[4774]: I1003 15:46:04.407891 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/cp-frr-files/0.log" Oct 03 15:46:04 crc kubenswrapper[4774]: I1003 15:46:04.433722 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/cp-reloader/0.log" Oct 03 15:46:04 crc kubenswrapper[4774]: I1003 15:46:04.485248 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/cp-metrics/0.log" Oct 03 15:46:04 crc kubenswrapper[4774]: I1003 15:46:04.488990 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/controller/0.log" Oct 03 15:46:04 crc kubenswrapper[4774]: I1003 15:46:04.623317 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/frr-metrics/0.log" Oct 03 15:46:04 crc kubenswrapper[4774]: I1003 15:46:04.727899 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/kube-rbac-proxy/0.log" Oct 03 15:46:04 crc kubenswrapper[4774]: I1003 15:46:04.758158 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/kube-rbac-proxy-frr/0.log" Oct 03 15:46:04 crc kubenswrapper[4774]: I1003 15:46:04.900186 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/reloader/0.log" Oct 03 15:46:04 crc kubenswrapper[4774]: I1003 15:46:04.988427 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-dtqfw_9c555a1f-be66-4efe-81ed-d2d90bd5e2f7/frr-k8s-webhook-server/0.log" Oct 03 15:46:05 crc kubenswrapper[4774]: I1003 15:46:05.450042 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7f6cc8bc96-sdmhf_87f70971-8510-4214-86dc-011aaf626b7a/manager/0.log" Oct 03 15:46:05 crc kubenswrapper[4774]: I1003 15:46:05.753733 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-696d9855d4-wkrd4_637291ee-7c78-4978-a59f-da3c5d284724/webhook-server/0.log" Oct 03 15:46:05 crc kubenswrapper[4774]: I1003 15:46:05.765799 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-xfjwm_e5a0c71d-7887-4a39-b427-221389fecc1e/kube-rbac-proxy/0.log" Oct 03 15:46:05 crc kubenswrapper[4774]: I1003 15:46:05.924737 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/frr/0.log" Oct 03 15:46:06 crc kubenswrapper[4774]: I1003 15:46:06.240344 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-xfjwm_e5a0c71d-7887-4a39-b427-221389fecc1e/speaker/0.log" Oct 03 15:46:17 crc kubenswrapper[4774]: I1003 15:46:17.300099 4774 scope.go:117] "RemoveContainer" containerID="6a9aac6dbc322b9090f9a59162378fc7732101d04a69222bfa4d228837707411" Oct 03 15:46:17 crc kubenswrapper[4774]: E1003 15:46:17.300801 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:46:18 crc kubenswrapper[4774]: I1003 
15:46:18.758019 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m_53c45343-23b3-4606-a5e1-bdd4c43b2752/util/0.log" Oct 03 15:46:18 crc kubenswrapper[4774]: I1003 15:46:18.903082 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m_53c45343-23b3-4606-a5e1-bdd4c43b2752/pull/0.log" Oct 03 15:46:18 crc kubenswrapper[4774]: I1003 15:46:18.911927 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m_53c45343-23b3-4606-a5e1-bdd4c43b2752/util/0.log" Oct 03 15:46:18 crc kubenswrapper[4774]: I1003 15:46:18.929479 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m_53c45343-23b3-4606-a5e1-bdd4c43b2752/pull/0.log" Oct 03 15:46:19 crc kubenswrapper[4774]: I1003 15:46:19.050785 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m_53c45343-23b3-4606-a5e1-bdd4c43b2752/util/0.log" Oct 03 15:46:19 crc kubenswrapper[4774]: I1003 15:46:19.051629 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m_53c45343-23b3-4606-a5e1-bdd4c43b2752/pull/0.log" Oct 03 15:46:19 crc kubenswrapper[4774]: I1003 15:46:19.076598 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m_53c45343-23b3-4606-a5e1-bdd4c43b2752/extract/0.log" Oct 03 15:46:19 crc kubenswrapper[4774]: I1003 15:46:19.229610 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-4knpn_a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c/extract-utilities/0.log" Oct 03 15:46:19 crc kubenswrapper[4774]: I1003 15:46:19.360215 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4knpn_a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c/extract-utilities/0.log" Oct 03 15:46:19 crc kubenswrapper[4774]: I1003 15:46:19.396756 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4knpn_a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c/extract-content/0.log" Oct 03 15:46:19 crc kubenswrapper[4774]: I1003 15:46:19.403257 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4knpn_a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c/extract-content/0.log" Oct 03 15:46:19 crc kubenswrapper[4774]: I1003 15:46:19.579453 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4knpn_a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c/extract-utilities/0.log" Oct 03 15:46:19 crc kubenswrapper[4774]: I1003 15:46:19.585406 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4knpn_a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c/extract-content/0.log" Oct 03 15:46:19 crc kubenswrapper[4774]: I1003 15:46:19.814122 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m7x75_8f0d7089-cd30-47db-a3cc-44492151e300/extract-utilities/0.log" Oct 03 15:46:20 crc kubenswrapper[4774]: I1003 15:46:20.029268 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m7x75_8f0d7089-cd30-47db-a3cc-44492151e300/extract-content/0.log" Oct 03 15:46:20 crc kubenswrapper[4774]: I1003 15:46:20.031124 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-4knpn_a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c/registry-server/0.log" Oct 03 15:46:20 crc kubenswrapper[4774]: I1003 15:46:20.069966 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m7x75_8f0d7089-cd30-47db-a3cc-44492151e300/extract-content/0.log" Oct 03 15:46:20 crc kubenswrapper[4774]: I1003 15:46:20.079866 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m7x75_8f0d7089-cd30-47db-a3cc-44492151e300/extract-utilities/0.log" Oct 03 15:46:20 crc kubenswrapper[4774]: I1003 15:46:20.197321 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m7x75_8f0d7089-cd30-47db-a3cc-44492151e300/extract-utilities/0.log" Oct 03 15:46:20 crc kubenswrapper[4774]: I1003 15:46:20.210994 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m7x75_8f0d7089-cd30-47db-a3cc-44492151e300/extract-content/0.log" Oct 03 15:46:20 crc kubenswrapper[4774]: I1003 15:46:20.410048 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm_7845d8dc-4399-4450-b1d6-d424a8d64539/util/0.log" Oct 03 15:46:20 crc kubenswrapper[4774]: I1003 15:46:20.635134 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm_7845d8dc-4399-4450-b1d6-d424a8d64539/pull/0.log" Oct 03 15:46:20 crc kubenswrapper[4774]: I1003 15:46:20.645242 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm_7845d8dc-4399-4450-b1d6-d424a8d64539/util/0.log" Oct 03 15:46:20 crc kubenswrapper[4774]: I1003 15:46:20.685300 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm_7845d8dc-4399-4450-b1d6-d424a8d64539/pull/0.log" Oct 03 15:46:20 crc kubenswrapper[4774]: I1003 15:46:20.726766 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m7x75_8f0d7089-cd30-47db-a3cc-44492151e300/registry-server/0.log" Oct 03 15:46:20 crc kubenswrapper[4774]: I1003 15:46:20.853880 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm_7845d8dc-4399-4450-b1d6-d424a8d64539/pull/0.log" Oct 03 15:46:20 crc kubenswrapper[4774]: I1003 15:46:20.856109 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm_7845d8dc-4399-4450-b1d6-d424a8d64539/util/0.log" Oct 03 15:46:20 crc kubenswrapper[4774]: I1003 15:46:20.870860 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm_7845d8dc-4399-4450-b1d6-d424a8d64539/extract/0.log" Oct 03 15:46:21 crc kubenswrapper[4774]: I1003 15:46:21.032244 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-p2hrr_3fd11a50-e44d-4d7f-b301-6c7069bf6096/marketplace-operator/0.log" Oct 03 15:46:21 crc kubenswrapper[4774]: I1003 15:46:21.065878 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2gxlv_cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6/extract-utilities/0.log" Oct 03 15:46:21 crc kubenswrapper[4774]: I1003 15:46:21.239919 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2gxlv_cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6/extract-utilities/0.log" Oct 03 15:46:21 crc kubenswrapper[4774]: I1003 15:46:21.248661 4774 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2gxlv_cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6/extract-content/0.log" Oct 03 15:46:21 crc kubenswrapper[4774]: I1003 15:46:21.255865 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2gxlv_cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6/extract-content/0.log" Oct 03 15:46:21 crc kubenswrapper[4774]: I1003 15:46:21.429864 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2gxlv_cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6/extract-utilities/0.log" Oct 03 15:46:21 crc kubenswrapper[4774]: I1003 15:46:21.430417 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2gxlv_cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6/extract-content/0.log" Oct 03 15:46:21 crc kubenswrapper[4774]: I1003 15:46:21.572761 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2gxlv_cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6/registry-server/0.log" Oct 03 15:46:21 crc kubenswrapper[4774]: I1003 15:46:21.608303 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mx9vw_f0c1612a-d998-4683-abf4-433f470c76b1/extract-utilities/0.log" Oct 03 15:46:21 crc kubenswrapper[4774]: I1003 15:46:21.775162 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mx9vw_f0c1612a-d998-4683-abf4-433f470c76b1/extract-content/0.log" Oct 03 15:46:21 crc kubenswrapper[4774]: I1003 15:46:21.787720 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mx9vw_f0c1612a-d998-4683-abf4-433f470c76b1/extract-utilities/0.log" Oct 03 15:46:21 crc kubenswrapper[4774]: I1003 15:46:21.796907 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-mx9vw_f0c1612a-d998-4683-abf4-433f470c76b1/extract-content/0.log" Oct 03 15:46:21 crc kubenswrapper[4774]: I1003 15:46:21.975614 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mx9vw_f0c1612a-d998-4683-abf4-433f470c76b1/extract-content/0.log" Oct 03 15:46:22 crc kubenswrapper[4774]: I1003 15:46:22.046849 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mx9vw_f0c1612a-d998-4683-abf4-433f470c76b1/extract-utilities/0.log" Oct 03 15:46:22 crc kubenswrapper[4774]: I1003 15:46:22.331242 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mx9vw_f0c1612a-d998-4683-abf4-433f470c76b1/registry-server/0.log" Oct 03 15:46:29 crc kubenswrapper[4774]: I1003 15:46:29.311003 4774 scope.go:117] "RemoveContainer" containerID="6a9aac6dbc322b9090f9a59162378fc7732101d04a69222bfa4d228837707411" Oct 03 15:46:29 crc kubenswrapper[4774]: E1003 15:46:29.312147 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:46:41 crc kubenswrapper[4774]: I1003 15:46:41.300624 4774 scope.go:117] "RemoveContainer" containerID="6a9aac6dbc322b9090f9a59162378fc7732101d04a69222bfa4d228837707411" Oct 03 15:46:41 crc kubenswrapper[4774]: E1003 15:46:41.301434 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:46:51 crc kubenswrapper[4774]: E1003 15:46:51.654429 4774 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.32:42106->38.102.83.32:40969: write tcp 38.102.83.32:42106->38.102.83.32:40969: write: broken pipe Oct 03 15:46:55 crc kubenswrapper[4774]: I1003 15:46:55.311742 4774 scope.go:117] "RemoveContainer" containerID="6a9aac6dbc322b9090f9a59162378fc7732101d04a69222bfa4d228837707411" Oct 03 15:46:55 crc kubenswrapper[4774]: E1003 15:46:55.312956 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:47:10 crc kubenswrapper[4774]: I1003 15:47:10.299493 4774 scope.go:117] "RemoveContainer" containerID="6a9aac6dbc322b9090f9a59162378fc7732101d04a69222bfa4d228837707411" Oct 03 15:47:10 crc kubenswrapper[4774]: E1003 15:47:10.300246 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:47:19 crc kubenswrapper[4774]: I1003 15:47:19.655933 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l42js"] Oct 03 
15:47:19 crc kubenswrapper[4774]: E1003 15:47:19.656971 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="498fbf9c-ad69-4822-a313-130c21ae73b1" containerName="collect-profiles" Oct 03 15:47:19 crc kubenswrapper[4774]: I1003 15:47:19.656988 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="498fbf9c-ad69-4822-a313-130c21ae73b1" containerName="collect-profiles" Oct 03 15:47:19 crc kubenswrapper[4774]: E1003 15:47:19.657019 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef2c9db9-b4bf-4c00-ab48-a43f8ae10bb3" containerName="container-00" Oct 03 15:47:19 crc kubenswrapper[4774]: I1003 15:47:19.657028 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef2c9db9-b4bf-4c00-ab48-a43f8ae10bb3" containerName="container-00" Oct 03 15:47:19 crc kubenswrapper[4774]: I1003 15:47:19.657248 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="498fbf9c-ad69-4822-a313-130c21ae73b1" containerName="collect-profiles" Oct 03 15:47:19 crc kubenswrapper[4774]: I1003 15:47:19.657264 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef2c9db9-b4bf-4c00-ab48-a43f8ae10bb3" containerName="container-00" Oct 03 15:47:19 crc kubenswrapper[4774]: I1003 15:47:19.658926 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l42js" Oct 03 15:47:19 crc kubenswrapper[4774]: I1003 15:47:19.675204 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l42js"] Oct 03 15:47:19 crc kubenswrapper[4774]: I1003 15:47:19.737052 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29f2c9de-4b00-4170-965e-8e0a9af453aa-catalog-content\") pod \"redhat-operators-l42js\" (UID: \"29f2c9de-4b00-4170-965e-8e0a9af453aa\") " pod="openshift-marketplace/redhat-operators-l42js" Oct 03 15:47:19 crc kubenswrapper[4774]: I1003 15:47:19.737452 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29f2c9de-4b00-4170-965e-8e0a9af453aa-utilities\") pod \"redhat-operators-l42js\" (UID: \"29f2c9de-4b00-4170-965e-8e0a9af453aa\") " pod="openshift-marketplace/redhat-operators-l42js" Oct 03 15:47:19 crc kubenswrapper[4774]: I1003 15:47:19.737544 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mbdf\" (UniqueName: \"kubernetes.io/projected/29f2c9de-4b00-4170-965e-8e0a9af453aa-kube-api-access-5mbdf\") pod \"redhat-operators-l42js\" (UID: \"29f2c9de-4b00-4170-965e-8e0a9af453aa\") " pod="openshift-marketplace/redhat-operators-l42js" Oct 03 15:47:19 crc kubenswrapper[4774]: I1003 15:47:19.838988 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29f2c9de-4b00-4170-965e-8e0a9af453aa-utilities\") pod \"redhat-operators-l42js\" (UID: \"29f2c9de-4b00-4170-965e-8e0a9af453aa\") " pod="openshift-marketplace/redhat-operators-l42js" Oct 03 15:47:19 crc kubenswrapper[4774]: I1003 15:47:19.839049 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5mbdf\" (UniqueName: \"kubernetes.io/projected/29f2c9de-4b00-4170-965e-8e0a9af453aa-kube-api-access-5mbdf\") pod \"redhat-operators-l42js\" (UID: \"29f2c9de-4b00-4170-965e-8e0a9af453aa\") " pod="openshift-marketplace/redhat-operators-l42js" Oct 03 15:47:19 crc kubenswrapper[4774]: I1003 15:47:19.839074 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29f2c9de-4b00-4170-965e-8e0a9af453aa-catalog-content\") pod \"redhat-operators-l42js\" (UID: \"29f2c9de-4b00-4170-965e-8e0a9af453aa\") " pod="openshift-marketplace/redhat-operators-l42js" Oct 03 15:47:19 crc kubenswrapper[4774]: I1003 15:47:19.839658 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29f2c9de-4b00-4170-965e-8e0a9af453aa-utilities\") pod \"redhat-operators-l42js\" (UID: \"29f2c9de-4b00-4170-965e-8e0a9af453aa\") " pod="openshift-marketplace/redhat-operators-l42js" Oct 03 15:47:19 crc kubenswrapper[4774]: I1003 15:47:19.839699 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29f2c9de-4b00-4170-965e-8e0a9af453aa-catalog-content\") pod \"redhat-operators-l42js\" (UID: \"29f2c9de-4b00-4170-965e-8e0a9af453aa\") " pod="openshift-marketplace/redhat-operators-l42js" Oct 03 15:47:19 crc kubenswrapper[4774]: I1003 15:47:19.864487 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mbdf\" (UniqueName: \"kubernetes.io/projected/29f2c9de-4b00-4170-965e-8e0a9af453aa-kube-api-access-5mbdf\") pod \"redhat-operators-l42js\" (UID: \"29f2c9de-4b00-4170-965e-8e0a9af453aa\") " pod="openshift-marketplace/redhat-operators-l42js" Oct 03 15:47:19 crc kubenswrapper[4774]: I1003 15:47:19.995366 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l42js" Oct 03 15:47:20 crc kubenswrapper[4774]: I1003 15:47:20.559824 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l42js"] Oct 03 15:47:20 crc kubenswrapper[4774]: I1003 15:47:20.909468 4774 generic.go:334] "Generic (PLEG): container finished" podID="29f2c9de-4b00-4170-965e-8e0a9af453aa" containerID="3883fe6bb328b2c519d3a5092012714d36113014ecca776770c3d22fd6514854" exitCode=0 Oct 03 15:47:20 crc kubenswrapper[4774]: I1003 15:47:20.909571 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l42js" event={"ID":"29f2c9de-4b00-4170-965e-8e0a9af453aa","Type":"ContainerDied","Data":"3883fe6bb328b2c519d3a5092012714d36113014ecca776770c3d22fd6514854"} Oct 03 15:47:20 crc kubenswrapper[4774]: I1003 15:47:20.909808 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l42js" event={"ID":"29f2c9de-4b00-4170-965e-8e0a9af453aa","Type":"ContainerStarted","Data":"ecc444537d7faf27cc089a5f36903702ca3027880860f3936448ef1afe4d203b"} Oct 03 15:47:22 crc kubenswrapper[4774]: I1003 15:47:22.299947 4774 scope.go:117] "RemoveContainer" containerID="6a9aac6dbc322b9090f9a59162378fc7732101d04a69222bfa4d228837707411" Oct 03 15:47:22 crc kubenswrapper[4774]: E1003 15:47:22.300404 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:47:22 crc kubenswrapper[4774]: I1003 15:47:22.929429 4774 generic.go:334] "Generic (PLEG): container finished" podID="29f2c9de-4b00-4170-965e-8e0a9af453aa" 
containerID="961876c9e8ee05b2da40235b73be11dc08d764ee2f5a4ec081c905c3f484b30e" exitCode=0 Oct 03 15:47:22 crc kubenswrapper[4774]: I1003 15:47:22.929496 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l42js" event={"ID":"29f2c9de-4b00-4170-965e-8e0a9af453aa","Type":"ContainerDied","Data":"961876c9e8ee05b2da40235b73be11dc08d764ee2f5a4ec081c905c3f484b30e"} Oct 03 15:47:23 crc kubenswrapper[4774]: I1003 15:47:23.972045 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l42js" event={"ID":"29f2c9de-4b00-4170-965e-8e0a9af453aa","Type":"ContainerStarted","Data":"60523a3ca0f72e9b95a13e69a775a8e2705425a7269a83e8d6103e3747e83433"} Oct 03 15:47:24 crc kubenswrapper[4774]: I1003 15:47:24.023044 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l42js" podStartSLOduration=2.267552149 podStartE2EDuration="5.023022666s" podCreationTimestamp="2025-10-03 15:47:19 +0000 UTC" firstStartedPulling="2025-10-03 15:47:20.911000861 +0000 UTC m=+3863.500204313" lastFinishedPulling="2025-10-03 15:47:23.666471378 +0000 UTC m=+3866.255674830" observedRunningTime="2025-10-03 15:47:24.016203598 +0000 UTC m=+3866.605407070" watchObservedRunningTime="2025-10-03 15:47:24.023022666 +0000 UTC m=+3866.612226118" Oct 03 15:47:29 crc kubenswrapper[4774]: I1003 15:47:29.996513 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l42js" Oct 03 15:47:29 crc kubenswrapper[4774]: I1003 15:47:29.997169 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l42js" Oct 03 15:47:30 crc kubenswrapper[4774]: I1003 15:47:30.153721 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l42js" Oct 03 15:47:30 crc kubenswrapper[4774]: I1003 15:47:30.231194 4774 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l42js" Oct 03 15:47:30 crc kubenswrapper[4774]: I1003 15:47:30.389206 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l42js"] Oct 03 15:47:32 crc kubenswrapper[4774]: I1003 15:47:32.047769 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l42js" podUID="29f2c9de-4b00-4170-965e-8e0a9af453aa" containerName="registry-server" containerID="cri-o://60523a3ca0f72e9b95a13e69a775a8e2705425a7269a83e8d6103e3747e83433" gracePeriod=2 Oct 03 15:47:32 crc kubenswrapper[4774]: I1003 15:47:32.522764 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l42js" Oct 03 15:47:32 crc kubenswrapper[4774]: I1003 15:47:32.593662 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29f2c9de-4b00-4170-965e-8e0a9af453aa-catalog-content\") pod \"29f2c9de-4b00-4170-965e-8e0a9af453aa\" (UID: \"29f2c9de-4b00-4170-965e-8e0a9af453aa\") " Oct 03 15:47:32 crc kubenswrapper[4774]: I1003 15:47:32.593729 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29f2c9de-4b00-4170-965e-8e0a9af453aa-utilities\") pod \"29f2c9de-4b00-4170-965e-8e0a9af453aa\" (UID: \"29f2c9de-4b00-4170-965e-8e0a9af453aa\") " Oct 03 15:47:32 crc kubenswrapper[4774]: I1003 15:47:32.593796 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mbdf\" (UniqueName: \"kubernetes.io/projected/29f2c9de-4b00-4170-965e-8e0a9af453aa-kube-api-access-5mbdf\") pod \"29f2c9de-4b00-4170-965e-8e0a9af453aa\" (UID: \"29f2c9de-4b00-4170-965e-8e0a9af453aa\") " Oct 03 15:47:32 crc kubenswrapper[4774]: I1003 15:47:32.594915 4774 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29f2c9de-4b00-4170-965e-8e0a9af453aa-utilities" (OuterVolumeSpecName: "utilities") pod "29f2c9de-4b00-4170-965e-8e0a9af453aa" (UID: "29f2c9de-4b00-4170-965e-8e0a9af453aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:47:32 crc kubenswrapper[4774]: I1003 15:47:32.609551 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29f2c9de-4b00-4170-965e-8e0a9af453aa-kube-api-access-5mbdf" (OuterVolumeSpecName: "kube-api-access-5mbdf") pod "29f2c9de-4b00-4170-965e-8e0a9af453aa" (UID: "29f2c9de-4b00-4170-965e-8e0a9af453aa"). InnerVolumeSpecName "kube-api-access-5mbdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:47:32 crc kubenswrapper[4774]: I1003 15:47:32.678148 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29f2c9de-4b00-4170-965e-8e0a9af453aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "29f2c9de-4b00-4170-965e-8e0a9af453aa" (UID: "29f2c9de-4b00-4170-965e-8e0a9af453aa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:47:32 crc kubenswrapper[4774]: I1003 15:47:32.696449 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29f2c9de-4b00-4170-965e-8e0a9af453aa-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 15:47:32 crc kubenswrapper[4774]: I1003 15:47:32.696501 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29f2c9de-4b00-4170-965e-8e0a9af453aa-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 15:47:32 crc kubenswrapper[4774]: I1003 15:47:32.696517 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mbdf\" (UniqueName: \"kubernetes.io/projected/29f2c9de-4b00-4170-965e-8e0a9af453aa-kube-api-access-5mbdf\") on node \"crc\" DevicePath \"\"" Oct 03 15:47:33 crc kubenswrapper[4774]: I1003 15:47:33.061938 4774 generic.go:334] "Generic (PLEG): container finished" podID="29f2c9de-4b00-4170-965e-8e0a9af453aa" containerID="60523a3ca0f72e9b95a13e69a775a8e2705425a7269a83e8d6103e3747e83433" exitCode=0 Oct 03 15:47:33 crc kubenswrapper[4774]: I1003 15:47:33.061995 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l42js" event={"ID":"29f2c9de-4b00-4170-965e-8e0a9af453aa","Type":"ContainerDied","Data":"60523a3ca0f72e9b95a13e69a775a8e2705425a7269a83e8d6103e3747e83433"} Oct 03 15:47:33 crc kubenswrapper[4774]: I1003 15:47:33.062030 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l42js" event={"ID":"29f2c9de-4b00-4170-965e-8e0a9af453aa","Type":"ContainerDied","Data":"ecc444537d7faf27cc089a5f36903702ca3027880860f3936448ef1afe4d203b"} Oct 03 15:47:33 crc kubenswrapper[4774]: I1003 15:47:33.062035 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l42js" Oct 03 15:47:33 crc kubenswrapper[4774]: I1003 15:47:33.062057 4774 scope.go:117] "RemoveContainer" containerID="60523a3ca0f72e9b95a13e69a775a8e2705425a7269a83e8d6103e3747e83433" Oct 03 15:47:33 crc kubenswrapper[4774]: I1003 15:47:33.091609 4774 scope.go:117] "RemoveContainer" containerID="961876c9e8ee05b2da40235b73be11dc08d764ee2f5a4ec081c905c3f484b30e" Oct 03 15:47:33 crc kubenswrapper[4774]: I1003 15:47:33.116890 4774 scope.go:117] "RemoveContainer" containerID="3883fe6bb328b2c519d3a5092012714d36113014ecca776770c3d22fd6514854" Oct 03 15:47:33 crc kubenswrapper[4774]: I1003 15:47:33.128873 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l42js"] Oct 03 15:47:33 crc kubenswrapper[4774]: I1003 15:47:33.136853 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l42js"] Oct 03 15:47:33 crc kubenswrapper[4774]: I1003 15:47:33.165071 4774 scope.go:117] "RemoveContainer" containerID="60523a3ca0f72e9b95a13e69a775a8e2705425a7269a83e8d6103e3747e83433" Oct 03 15:47:33 crc kubenswrapper[4774]: E1003 15:47:33.165527 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60523a3ca0f72e9b95a13e69a775a8e2705425a7269a83e8d6103e3747e83433\": container with ID starting with 60523a3ca0f72e9b95a13e69a775a8e2705425a7269a83e8d6103e3747e83433 not found: ID does not exist" containerID="60523a3ca0f72e9b95a13e69a775a8e2705425a7269a83e8d6103e3747e83433" Oct 03 15:47:33 crc kubenswrapper[4774]: I1003 15:47:33.165568 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60523a3ca0f72e9b95a13e69a775a8e2705425a7269a83e8d6103e3747e83433"} err="failed to get container status \"60523a3ca0f72e9b95a13e69a775a8e2705425a7269a83e8d6103e3747e83433\": rpc error: code = NotFound desc = could not find container 
\"60523a3ca0f72e9b95a13e69a775a8e2705425a7269a83e8d6103e3747e83433\": container with ID starting with 60523a3ca0f72e9b95a13e69a775a8e2705425a7269a83e8d6103e3747e83433 not found: ID does not exist" Oct 03 15:47:33 crc kubenswrapper[4774]: I1003 15:47:33.165594 4774 scope.go:117] "RemoveContainer" containerID="961876c9e8ee05b2da40235b73be11dc08d764ee2f5a4ec081c905c3f484b30e" Oct 03 15:47:33 crc kubenswrapper[4774]: E1003 15:47:33.166018 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"961876c9e8ee05b2da40235b73be11dc08d764ee2f5a4ec081c905c3f484b30e\": container with ID starting with 961876c9e8ee05b2da40235b73be11dc08d764ee2f5a4ec081c905c3f484b30e not found: ID does not exist" containerID="961876c9e8ee05b2da40235b73be11dc08d764ee2f5a4ec081c905c3f484b30e" Oct 03 15:47:33 crc kubenswrapper[4774]: I1003 15:47:33.166050 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"961876c9e8ee05b2da40235b73be11dc08d764ee2f5a4ec081c905c3f484b30e"} err="failed to get container status \"961876c9e8ee05b2da40235b73be11dc08d764ee2f5a4ec081c905c3f484b30e\": rpc error: code = NotFound desc = could not find container \"961876c9e8ee05b2da40235b73be11dc08d764ee2f5a4ec081c905c3f484b30e\": container with ID starting with 961876c9e8ee05b2da40235b73be11dc08d764ee2f5a4ec081c905c3f484b30e not found: ID does not exist" Oct 03 15:47:33 crc kubenswrapper[4774]: I1003 15:47:33.166068 4774 scope.go:117] "RemoveContainer" containerID="3883fe6bb328b2c519d3a5092012714d36113014ecca776770c3d22fd6514854" Oct 03 15:47:33 crc kubenswrapper[4774]: E1003 15:47:33.166325 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3883fe6bb328b2c519d3a5092012714d36113014ecca776770c3d22fd6514854\": container with ID starting with 3883fe6bb328b2c519d3a5092012714d36113014ecca776770c3d22fd6514854 not found: ID does not exist" 
containerID="3883fe6bb328b2c519d3a5092012714d36113014ecca776770c3d22fd6514854" Oct 03 15:47:33 crc kubenswrapper[4774]: I1003 15:47:33.166353 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3883fe6bb328b2c519d3a5092012714d36113014ecca776770c3d22fd6514854"} err="failed to get container status \"3883fe6bb328b2c519d3a5092012714d36113014ecca776770c3d22fd6514854\": rpc error: code = NotFound desc = could not find container \"3883fe6bb328b2c519d3a5092012714d36113014ecca776770c3d22fd6514854\": container with ID starting with 3883fe6bb328b2c519d3a5092012714d36113014ecca776770c3d22fd6514854 not found: ID does not exist" Oct 03 15:47:33 crc kubenswrapper[4774]: I1003 15:47:33.355362 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29f2c9de-4b00-4170-965e-8e0a9af453aa" path="/var/lib/kubelet/pods/29f2c9de-4b00-4170-965e-8e0a9af453aa/volumes" Oct 03 15:47:37 crc kubenswrapper[4774]: I1003 15:47:37.300596 4774 scope.go:117] "RemoveContainer" containerID="6a9aac6dbc322b9090f9a59162378fc7732101d04a69222bfa4d228837707411" Oct 03 15:47:37 crc kubenswrapper[4774]: E1003 15:47:37.301457 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:47:50 crc kubenswrapper[4774]: I1003 15:47:50.300611 4774 scope.go:117] "RemoveContainer" containerID="6a9aac6dbc322b9090f9a59162378fc7732101d04a69222bfa4d228837707411" Oct 03 15:47:50 crc kubenswrapper[4774]: E1003 15:47:50.301457 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:48:02 crc kubenswrapper[4774]: I1003 15:48:02.299667 4774 scope.go:117] "RemoveContainer" containerID="6a9aac6dbc322b9090f9a59162378fc7732101d04a69222bfa4d228837707411" Oct 03 15:48:02 crc kubenswrapper[4774]: E1003 15:48:02.300571 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:48:17 crc kubenswrapper[4774]: I1003 15:48:17.300128 4774 scope.go:117] "RemoveContainer" containerID="6a9aac6dbc322b9090f9a59162378fc7732101d04a69222bfa4d228837707411" Oct 03 15:48:17 crc kubenswrapper[4774]: E1003 15:48:17.301353 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:48:20 crc kubenswrapper[4774]: I1003 15:48:20.545796 4774 generic.go:334] "Generic (PLEG): container finished" podID="9153c48d-9951-41c0-b790-8f44aa0c8e77" containerID="464edeeac6284cf36415fec88fb62e4acdc0daa5bbc59f7677d45d09146e3778" exitCode=0 Oct 03 15:48:20 crc kubenswrapper[4774]: I1003 15:48:20.545870 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-r575n/must-gather-7pk24" event={"ID":"9153c48d-9951-41c0-b790-8f44aa0c8e77","Type":"ContainerDied","Data":"464edeeac6284cf36415fec88fb62e4acdc0daa5bbc59f7677d45d09146e3778"} Oct 03 15:48:20 crc kubenswrapper[4774]: I1003 15:48:20.546978 4774 scope.go:117] "RemoveContainer" containerID="464edeeac6284cf36415fec88fb62e4acdc0daa5bbc59f7677d45d09146e3778" Oct 03 15:48:21 crc kubenswrapper[4774]: I1003 15:48:21.593275 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-r575n_must-gather-7pk24_9153c48d-9951-41c0-b790-8f44aa0c8e77/gather/0.log" Oct 03 15:48:29 crc kubenswrapper[4774]: I1003 15:48:29.071201 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-r575n/must-gather-7pk24"] Oct 03 15:48:29 crc kubenswrapper[4774]: I1003 15:48:29.071836 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-r575n/must-gather-7pk24"] Oct 03 15:48:29 crc kubenswrapper[4774]: I1003 15:48:29.072073 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-r575n/must-gather-7pk24" podUID="9153c48d-9951-41c0-b790-8f44aa0c8e77" containerName="copy" containerID="cri-o://10eee216ac078c5911001b6a5c610126818015502066578145b170be85aab961" gracePeriod=2 Oct 03 15:48:29 crc kubenswrapper[4774]: I1003 15:48:29.568880 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-r575n_must-gather-7pk24_9153c48d-9951-41c0-b790-8f44aa0c8e77/copy/0.log" Oct 03 15:48:29 crc kubenswrapper[4774]: I1003 15:48:29.569483 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r575n/must-gather-7pk24" Oct 03 15:48:29 crc kubenswrapper[4774]: I1003 15:48:29.644068 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-r575n_must-gather-7pk24_9153c48d-9951-41c0-b790-8f44aa0c8e77/copy/0.log" Oct 03 15:48:29 crc kubenswrapper[4774]: I1003 15:48:29.644537 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndgkc\" (UniqueName: \"kubernetes.io/projected/9153c48d-9951-41c0-b790-8f44aa0c8e77-kube-api-access-ndgkc\") pod \"9153c48d-9951-41c0-b790-8f44aa0c8e77\" (UID: \"9153c48d-9951-41c0-b790-8f44aa0c8e77\") " Oct 03 15:48:29 crc kubenswrapper[4774]: I1003 15:48:29.644635 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9153c48d-9951-41c0-b790-8f44aa0c8e77-must-gather-output\") pod \"9153c48d-9951-41c0-b790-8f44aa0c8e77\" (UID: \"9153c48d-9951-41c0-b790-8f44aa0c8e77\") " Oct 03 15:48:29 crc kubenswrapper[4774]: I1003 15:48:29.644744 4774 generic.go:334] "Generic (PLEG): container finished" podID="9153c48d-9951-41c0-b790-8f44aa0c8e77" containerID="10eee216ac078c5911001b6a5c610126818015502066578145b170be85aab961" exitCode=143 Oct 03 15:48:29 crc kubenswrapper[4774]: I1003 15:48:29.644863 4774 scope.go:117] "RemoveContainer" containerID="10eee216ac078c5911001b6a5c610126818015502066578145b170be85aab961" Oct 03 15:48:29 crc kubenswrapper[4774]: I1003 15:48:29.645048 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r575n/must-gather-7pk24" Oct 03 15:48:29 crc kubenswrapper[4774]: I1003 15:48:29.667298 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9153c48d-9951-41c0-b790-8f44aa0c8e77-kube-api-access-ndgkc" (OuterVolumeSpecName: "kube-api-access-ndgkc") pod "9153c48d-9951-41c0-b790-8f44aa0c8e77" (UID: "9153c48d-9951-41c0-b790-8f44aa0c8e77"). InnerVolumeSpecName "kube-api-access-ndgkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:48:29 crc kubenswrapper[4774]: I1003 15:48:29.718289 4774 scope.go:117] "RemoveContainer" containerID="464edeeac6284cf36415fec88fb62e4acdc0daa5bbc59f7677d45d09146e3778" Oct 03 15:48:29 crc kubenswrapper[4774]: I1003 15:48:29.746587 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndgkc\" (UniqueName: \"kubernetes.io/projected/9153c48d-9951-41c0-b790-8f44aa0c8e77-kube-api-access-ndgkc\") on node \"crc\" DevicePath \"\"" Oct 03 15:48:29 crc kubenswrapper[4774]: I1003 15:48:29.800287 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9153c48d-9951-41c0-b790-8f44aa0c8e77-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "9153c48d-9951-41c0-b790-8f44aa0c8e77" (UID: "9153c48d-9951-41c0-b790-8f44aa0c8e77"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:48:29 crc kubenswrapper[4774]: I1003 15:48:29.809303 4774 scope.go:117] "RemoveContainer" containerID="10eee216ac078c5911001b6a5c610126818015502066578145b170be85aab961" Oct 03 15:48:29 crc kubenswrapper[4774]: E1003 15:48:29.809900 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10eee216ac078c5911001b6a5c610126818015502066578145b170be85aab961\": container with ID starting with 10eee216ac078c5911001b6a5c610126818015502066578145b170be85aab961 not found: ID does not exist" containerID="10eee216ac078c5911001b6a5c610126818015502066578145b170be85aab961" Oct 03 15:48:29 crc kubenswrapper[4774]: I1003 15:48:29.809937 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10eee216ac078c5911001b6a5c610126818015502066578145b170be85aab961"} err="failed to get container status \"10eee216ac078c5911001b6a5c610126818015502066578145b170be85aab961\": rpc error: code = NotFound desc = could not find container \"10eee216ac078c5911001b6a5c610126818015502066578145b170be85aab961\": container with ID starting with 10eee216ac078c5911001b6a5c610126818015502066578145b170be85aab961 not found: ID does not exist" Oct 03 15:48:29 crc kubenswrapper[4774]: I1003 15:48:29.809962 4774 scope.go:117] "RemoveContainer" containerID="464edeeac6284cf36415fec88fb62e4acdc0daa5bbc59f7677d45d09146e3778" Oct 03 15:48:29 crc kubenswrapper[4774]: E1003 15:48:29.811274 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"464edeeac6284cf36415fec88fb62e4acdc0daa5bbc59f7677d45d09146e3778\": container with ID starting with 464edeeac6284cf36415fec88fb62e4acdc0daa5bbc59f7677d45d09146e3778 not found: ID does not exist" containerID="464edeeac6284cf36415fec88fb62e4acdc0daa5bbc59f7677d45d09146e3778" Oct 03 15:48:29 crc kubenswrapper[4774]: I1003 15:48:29.811334 
4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"464edeeac6284cf36415fec88fb62e4acdc0daa5bbc59f7677d45d09146e3778"} err="failed to get container status \"464edeeac6284cf36415fec88fb62e4acdc0daa5bbc59f7677d45d09146e3778\": rpc error: code = NotFound desc = could not find container \"464edeeac6284cf36415fec88fb62e4acdc0daa5bbc59f7677d45d09146e3778\": container with ID starting with 464edeeac6284cf36415fec88fb62e4acdc0daa5bbc59f7677d45d09146e3778 not found: ID does not exist" Oct 03 15:48:29 crc kubenswrapper[4774]: I1003 15:48:29.849126 4774 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9153c48d-9951-41c0-b790-8f44aa0c8e77-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 03 15:48:30 crc kubenswrapper[4774]: I1003 15:48:30.300434 4774 scope.go:117] "RemoveContainer" containerID="6a9aac6dbc322b9090f9a59162378fc7732101d04a69222bfa4d228837707411" Oct 03 15:48:30 crc kubenswrapper[4774]: E1003 15:48:30.301124 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:48:31 crc kubenswrapper[4774]: I1003 15:48:31.312858 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9153c48d-9951-41c0-b790-8f44aa0c8e77" path="/var/lib/kubelet/pods/9153c48d-9951-41c0-b790-8f44aa0c8e77/volumes" Oct 03 15:48:42 crc kubenswrapper[4774]: I1003 15:48:42.299811 4774 scope.go:117] "RemoveContainer" containerID="6a9aac6dbc322b9090f9a59162378fc7732101d04a69222bfa4d228837707411" Oct 03 15:48:42 crc kubenswrapper[4774]: E1003 15:48:42.300737 4774 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:48:54 crc kubenswrapper[4774]: I1003 15:48:54.299571 4774 scope.go:117] "RemoveContainer" containerID="6a9aac6dbc322b9090f9a59162378fc7732101d04a69222bfa4d228837707411" Oct 03 15:48:54 crc kubenswrapper[4774]: E1003 15:48:54.300346 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:49:05 crc kubenswrapper[4774]: I1003 15:49:05.300162 4774 scope.go:117] "RemoveContainer" containerID="6a9aac6dbc322b9090f9a59162378fc7732101d04a69222bfa4d228837707411" Oct 03 15:49:05 crc kubenswrapper[4774]: E1003 15:49:05.301887 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:49:08 crc kubenswrapper[4774]: I1003 15:49:08.806992 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2968w/must-gather-ftqsx"] Oct 03 15:49:08 crc kubenswrapper[4774]: E1003 15:49:08.808761 4774 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29f2c9de-4b00-4170-965e-8e0a9af453aa" containerName="extract-content" Oct 03 15:49:08 crc kubenswrapper[4774]: I1003 15:49:08.808784 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="29f2c9de-4b00-4170-965e-8e0a9af453aa" containerName="extract-content" Oct 03 15:49:08 crc kubenswrapper[4774]: E1003 15:49:08.808805 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9153c48d-9951-41c0-b790-8f44aa0c8e77" containerName="gather" Oct 03 15:49:08 crc kubenswrapper[4774]: I1003 15:49:08.808815 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="9153c48d-9951-41c0-b790-8f44aa0c8e77" containerName="gather" Oct 03 15:49:08 crc kubenswrapper[4774]: E1003 15:49:08.808834 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29f2c9de-4b00-4170-965e-8e0a9af453aa" containerName="extract-utilities" Oct 03 15:49:08 crc kubenswrapper[4774]: I1003 15:49:08.808844 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="29f2c9de-4b00-4170-965e-8e0a9af453aa" containerName="extract-utilities" Oct 03 15:49:08 crc kubenswrapper[4774]: E1003 15:49:08.808858 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9153c48d-9951-41c0-b790-8f44aa0c8e77" containerName="copy" Oct 03 15:49:08 crc kubenswrapper[4774]: I1003 15:49:08.808866 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="9153c48d-9951-41c0-b790-8f44aa0c8e77" containerName="copy" Oct 03 15:49:08 crc kubenswrapper[4774]: E1003 15:49:08.808881 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29f2c9de-4b00-4170-965e-8e0a9af453aa" containerName="registry-server" Oct 03 15:49:08 crc kubenswrapper[4774]: I1003 15:49:08.808889 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="29f2c9de-4b00-4170-965e-8e0a9af453aa" containerName="registry-server" Oct 03 15:49:08 crc kubenswrapper[4774]: I1003 15:49:08.809116 4774 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="9153c48d-9951-41c0-b790-8f44aa0c8e77" containerName="gather" Oct 03 15:49:08 crc kubenswrapper[4774]: I1003 15:49:08.809131 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="9153c48d-9951-41c0-b790-8f44aa0c8e77" containerName="copy" Oct 03 15:49:08 crc kubenswrapper[4774]: I1003 15:49:08.809152 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="29f2c9de-4b00-4170-965e-8e0a9af453aa" containerName="registry-server" Oct 03 15:49:08 crc kubenswrapper[4774]: I1003 15:49:08.810401 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2968w/must-gather-ftqsx" Oct 03 15:49:08 crc kubenswrapper[4774]: I1003 15:49:08.812623 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-2968w"/"openshift-service-ca.crt" Oct 03 15:49:08 crc kubenswrapper[4774]: I1003 15:49:08.813597 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-2968w"/"kube-root-ca.crt" Oct 03 15:49:08 crc kubenswrapper[4774]: I1003 15:49:08.828903 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2968w/must-gather-ftqsx"] Oct 03 15:49:08 crc kubenswrapper[4774]: I1003 15:49:08.953182 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d6bade7f-119a-4b23-bbf8-27860297b296-must-gather-output\") pod \"must-gather-ftqsx\" (UID: \"d6bade7f-119a-4b23-bbf8-27860297b296\") " pod="openshift-must-gather-2968w/must-gather-ftqsx" Oct 03 15:49:08 crc kubenswrapper[4774]: I1003 15:49:08.953354 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28hrx\" (UniqueName: \"kubernetes.io/projected/d6bade7f-119a-4b23-bbf8-27860297b296-kube-api-access-28hrx\") pod \"must-gather-ftqsx\" (UID: \"d6bade7f-119a-4b23-bbf8-27860297b296\") " 
pod="openshift-must-gather-2968w/must-gather-ftqsx" Oct 03 15:49:09 crc kubenswrapper[4774]: I1003 15:49:09.055415 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d6bade7f-119a-4b23-bbf8-27860297b296-must-gather-output\") pod \"must-gather-ftqsx\" (UID: \"d6bade7f-119a-4b23-bbf8-27860297b296\") " pod="openshift-must-gather-2968w/must-gather-ftqsx" Oct 03 15:49:09 crc kubenswrapper[4774]: I1003 15:49:09.055515 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28hrx\" (UniqueName: \"kubernetes.io/projected/d6bade7f-119a-4b23-bbf8-27860297b296-kube-api-access-28hrx\") pod \"must-gather-ftqsx\" (UID: \"d6bade7f-119a-4b23-bbf8-27860297b296\") " pod="openshift-must-gather-2968w/must-gather-ftqsx" Oct 03 15:49:09 crc kubenswrapper[4774]: I1003 15:49:09.056164 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d6bade7f-119a-4b23-bbf8-27860297b296-must-gather-output\") pod \"must-gather-ftqsx\" (UID: \"d6bade7f-119a-4b23-bbf8-27860297b296\") " pod="openshift-must-gather-2968w/must-gather-ftqsx" Oct 03 15:49:09 crc kubenswrapper[4774]: I1003 15:49:09.077130 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28hrx\" (UniqueName: \"kubernetes.io/projected/d6bade7f-119a-4b23-bbf8-27860297b296-kube-api-access-28hrx\") pod \"must-gather-ftqsx\" (UID: \"d6bade7f-119a-4b23-bbf8-27860297b296\") " pod="openshift-must-gather-2968w/must-gather-ftqsx" Oct 03 15:49:09 crc kubenswrapper[4774]: I1003 15:49:09.129323 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2968w/must-gather-ftqsx" Oct 03 15:49:09 crc kubenswrapper[4774]: I1003 15:49:09.621793 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2968w/must-gather-ftqsx"] Oct 03 15:49:10 crc kubenswrapper[4774]: I1003 15:49:10.064716 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2968w/must-gather-ftqsx" event={"ID":"d6bade7f-119a-4b23-bbf8-27860297b296","Type":"ContainerStarted","Data":"832d8e112054844bc36903eeb577129ed6bcfbfa04d6e91c7a72367d9b80d29a"} Oct 03 15:49:10 crc kubenswrapper[4774]: I1003 15:49:10.065090 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2968w/must-gather-ftqsx" event={"ID":"d6bade7f-119a-4b23-bbf8-27860297b296","Type":"ContainerStarted","Data":"a1eca720c7ec09680f15761b27b17ae036e75ced0465153ca9ddf29efc8b6754"} Oct 03 15:49:11 crc kubenswrapper[4774]: I1003 15:49:11.075972 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2968w/must-gather-ftqsx" event={"ID":"d6bade7f-119a-4b23-bbf8-27860297b296","Type":"ContainerStarted","Data":"935050b4acabd37437c5ca63cd170014713d0b0231d78ab6811fef5a1e5692fb"} Oct 03 15:49:11 crc kubenswrapper[4774]: I1003 15:49:11.099827 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2968w/must-gather-ftqsx" podStartSLOduration=3.099806426 podStartE2EDuration="3.099806426s" podCreationTimestamp="2025-10-03 15:49:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:49:11.096109254 +0000 UTC m=+3973.685312716" watchObservedRunningTime="2025-10-03 15:49:11.099806426 +0000 UTC m=+3973.689009878" Oct 03 15:49:13 crc kubenswrapper[4774]: I1003 15:49:13.252233 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2968w/crc-debug-zhtcc"] Oct 03 15:49:13 crc kubenswrapper[4774]: 
I1003 15:49:13.253835 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2968w/crc-debug-zhtcc" Oct 03 15:49:13 crc kubenswrapper[4774]: I1003 15:49:13.255885 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-2968w"/"default-dockercfg-vcgvt" Oct 03 15:49:13 crc kubenswrapper[4774]: I1003 15:49:13.439733 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5qxc\" (UniqueName: \"kubernetes.io/projected/595ae358-51f7-4e1f-830d-a21614c7726c-kube-api-access-f5qxc\") pod \"crc-debug-zhtcc\" (UID: \"595ae358-51f7-4e1f-830d-a21614c7726c\") " pod="openshift-must-gather-2968w/crc-debug-zhtcc" Oct 03 15:49:13 crc kubenswrapper[4774]: I1003 15:49:13.439812 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/595ae358-51f7-4e1f-830d-a21614c7726c-host\") pod \"crc-debug-zhtcc\" (UID: \"595ae358-51f7-4e1f-830d-a21614c7726c\") " pod="openshift-must-gather-2968w/crc-debug-zhtcc" Oct 03 15:49:13 crc kubenswrapper[4774]: I1003 15:49:13.542640 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5qxc\" (UniqueName: \"kubernetes.io/projected/595ae358-51f7-4e1f-830d-a21614c7726c-kube-api-access-f5qxc\") pod \"crc-debug-zhtcc\" (UID: \"595ae358-51f7-4e1f-830d-a21614c7726c\") " pod="openshift-must-gather-2968w/crc-debug-zhtcc" Oct 03 15:49:13 crc kubenswrapper[4774]: I1003 15:49:13.542728 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/595ae358-51f7-4e1f-830d-a21614c7726c-host\") pod \"crc-debug-zhtcc\" (UID: \"595ae358-51f7-4e1f-830d-a21614c7726c\") " pod="openshift-must-gather-2968w/crc-debug-zhtcc" Oct 03 15:49:13 crc kubenswrapper[4774]: I1003 15:49:13.543032 4774 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/595ae358-51f7-4e1f-830d-a21614c7726c-host\") pod \"crc-debug-zhtcc\" (UID: \"595ae358-51f7-4e1f-830d-a21614c7726c\") " pod="openshift-must-gather-2968w/crc-debug-zhtcc" Oct 03 15:49:13 crc kubenswrapper[4774]: I1003 15:49:13.566718 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5qxc\" (UniqueName: \"kubernetes.io/projected/595ae358-51f7-4e1f-830d-a21614c7726c-kube-api-access-f5qxc\") pod \"crc-debug-zhtcc\" (UID: \"595ae358-51f7-4e1f-830d-a21614c7726c\") " pod="openshift-must-gather-2968w/crc-debug-zhtcc" Oct 03 15:49:13 crc kubenswrapper[4774]: I1003 15:49:13.582552 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2968w/crc-debug-zhtcc" Oct 03 15:49:13 crc kubenswrapper[4774]: W1003 15:49:13.609915 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod595ae358_51f7_4e1f_830d_a21614c7726c.slice/crio-18a818b00b83d0c659bb21bab4655b53706935ae5992a8a260cc8ad5fedceb18 WatchSource:0}: Error finding container 18a818b00b83d0c659bb21bab4655b53706935ae5992a8a260cc8ad5fedceb18: Status 404 returned error can't find the container with id 18a818b00b83d0c659bb21bab4655b53706935ae5992a8a260cc8ad5fedceb18 Oct 03 15:49:14 crc kubenswrapper[4774]: I1003 15:49:14.102796 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2968w/crc-debug-zhtcc" event={"ID":"595ae358-51f7-4e1f-830d-a21614c7726c","Type":"ContainerStarted","Data":"0f125f54ee63c120a173107e0f934ef6dfaec0b7ef3cb99aceb30884c4e5a1ef"} Oct 03 15:49:14 crc kubenswrapper[4774]: I1003 15:49:14.103359 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2968w/crc-debug-zhtcc" event={"ID":"595ae358-51f7-4e1f-830d-a21614c7726c","Type":"ContainerStarted","Data":"18a818b00b83d0c659bb21bab4655b53706935ae5992a8a260cc8ad5fedceb18"} Oct 
03 15:49:14 crc kubenswrapper[4774]: I1003 15:49:14.124396 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2968w/crc-debug-zhtcc" podStartSLOduration=1.124363575 podStartE2EDuration="1.124363575s" podCreationTimestamp="2025-10-03 15:49:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:49:14.118450038 +0000 UTC m=+3976.707653490" watchObservedRunningTime="2025-10-03 15:49:14.124363575 +0000 UTC m=+3976.713567027" Oct 03 15:49:17 crc kubenswrapper[4774]: I1003 15:49:17.299690 4774 scope.go:117] "RemoveContainer" containerID="6a9aac6dbc322b9090f9a59162378fc7732101d04a69222bfa4d228837707411" Oct 03 15:49:17 crc kubenswrapper[4774]: E1003 15:49:17.300817 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:49:30 crc kubenswrapper[4774]: I1003 15:49:30.300449 4774 scope.go:117] "RemoveContainer" containerID="6a9aac6dbc322b9090f9a59162378fc7732101d04a69222bfa4d228837707411" Oct 03 15:49:30 crc kubenswrapper[4774]: E1003 15:49:30.301235 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:49:41 crc kubenswrapper[4774]: I1003 15:49:41.098319 4774 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-425fn"] Oct 03 15:49:41 crc kubenswrapper[4774]: I1003 15:49:41.101327 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-425fn" Oct 03 15:49:41 crc kubenswrapper[4774]: I1003 15:49:41.115715 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-425fn"] Oct 03 15:49:41 crc kubenswrapper[4774]: I1003 15:49:41.180758 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64b2j\" (UniqueName: \"kubernetes.io/projected/a5b55b99-f8ef-4f35-9b44-da46e01ee2eb-kube-api-access-64b2j\") pod \"certified-operators-425fn\" (UID: \"a5b55b99-f8ef-4f35-9b44-da46e01ee2eb\") " pod="openshift-marketplace/certified-operators-425fn" Oct 03 15:49:41 crc kubenswrapper[4774]: I1003 15:49:41.180835 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5b55b99-f8ef-4f35-9b44-da46e01ee2eb-utilities\") pod \"certified-operators-425fn\" (UID: \"a5b55b99-f8ef-4f35-9b44-da46e01ee2eb\") " pod="openshift-marketplace/certified-operators-425fn" Oct 03 15:49:41 crc kubenswrapper[4774]: I1003 15:49:41.180867 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5b55b99-f8ef-4f35-9b44-da46e01ee2eb-catalog-content\") pod \"certified-operators-425fn\" (UID: \"a5b55b99-f8ef-4f35-9b44-da46e01ee2eb\") " pod="openshift-marketplace/certified-operators-425fn" Oct 03 15:49:41 crc kubenswrapper[4774]: I1003 15:49:41.282484 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5b55b99-f8ef-4f35-9b44-da46e01ee2eb-utilities\") pod \"certified-operators-425fn\" (UID: 
\"a5b55b99-f8ef-4f35-9b44-da46e01ee2eb\") " pod="openshift-marketplace/certified-operators-425fn" Oct 03 15:49:41 crc kubenswrapper[4774]: I1003 15:49:41.282545 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5b55b99-f8ef-4f35-9b44-da46e01ee2eb-catalog-content\") pod \"certified-operators-425fn\" (UID: \"a5b55b99-f8ef-4f35-9b44-da46e01ee2eb\") " pod="openshift-marketplace/certified-operators-425fn" Oct 03 15:49:41 crc kubenswrapper[4774]: I1003 15:49:41.282688 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64b2j\" (UniqueName: \"kubernetes.io/projected/a5b55b99-f8ef-4f35-9b44-da46e01ee2eb-kube-api-access-64b2j\") pod \"certified-operators-425fn\" (UID: \"a5b55b99-f8ef-4f35-9b44-da46e01ee2eb\") " pod="openshift-marketplace/certified-operators-425fn" Oct 03 15:49:41 crc kubenswrapper[4774]: I1003 15:49:41.283585 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5b55b99-f8ef-4f35-9b44-da46e01ee2eb-utilities\") pod \"certified-operators-425fn\" (UID: \"a5b55b99-f8ef-4f35-9b44-da46e01ee2eb\") " pod="openshift-marketplace/certified-operators-425fn" Oct 03 15:49:41 crc kubenswrapper[4774]: I1003 15:49:41.283861 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5b55b99-f8ef-4f35-9b44-da46e01ee2eb-catalog-content\") pod \"certified-operators-425fn\" (UID: \"a5b55b99-f8ef-4f35-9b44-da46e01ee2eb\") " pod="openshift-marketplace/certified-operators-425fn" Oct 03 15:49:41 crc kubenswrapper[4774]: I1003 15:49:41.285299 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r8mcw"] Oct 03 15:49:41 crc kubenswrapper[4774]: I1003 15:49:41.287164 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r8mcw" Oct 03 15:49:41 crc kubenswrapper[4774]: I1003 15:49:41.325254 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64b2j\" (UniqueName: \"kubernetes.io/projected/a5b55b99-f8ef-4f35-9b44-da46e01ee2eb-kube-api-access-64b2j\") pod \"certified-operators-425fn\" (UID: \"a5b55b99-f8ef-4f35-9b44-da46e01ee2eb\") " pod="openshift-marketplace/certified-operators-425fn" Oct 03 15:49:41 crc kubenswrapper[4774]: I1003 15:49:41.334123 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r8mcw"] Oct 03 15:49:41 crc kubenswrapper[4774]: I1003 15:49:41.384567 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56v9k\" (UniqueName: \"kubernetes.io/projected/fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c-kube-api-access-56v9k\") pod \"community-operators-r8mcw\" (UID: \"fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c\") " pod="openshift-marketplace/community-operators-r8mcw" Oct 03 15:49:41 crc kubenswrapper[4774]: I1003 15:49:41.384795 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c-utilities\") pod \"community-operators-r8mcw\" (UID: \"fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c\") " pod="openshift-marketplace/community-operators-r8mcw" Oct 03 15:49:41 crc kubenswrapper[4774]: I1003 15:49:41.384905 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c-catalog-content\") pod \"community-operators-r8mcw\" (UID: \"fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c\") " pod="openshift-marketplace/community-operators-r8mcw" Oct 03 15:49:41 crc kubenswrapper[4774]: I1003 15:49:41.431135 4774 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-425fn" Oct 03 15:49:41 crc kubenswrapper[4774]: I1003 15:49:41.486235 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56v9k\" (UniqueName: \"kubernetes.io/projected/fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c-kube-api-access-56v9k\") pod \"community-operators-r8mcw\" (UID: \"fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c\") " pod="openshift-marketplace/community-operators-r8mcw" Oct 03 15:49:41 crc kubenswrapper[4774]: I1003 15:49:41.486334 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c-utilities\") pod \"community-operators-r8mcw\" (UID: \"fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c\") " pod="openshift-marketplace/community-operators-r8mcw" Oct 03 15:49:41 crc kubenswrapper[4774]: I1003 15:49:41.486383 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c-catalog-content\") pod \"community-operators-r8mcw\" (UID: \"fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c\") " pod="openshift-marketplace/community-operators-r8mcw" Oct 03 15:49:41 crc kubenswrapper[4774]: I1003 15:49:41.486968 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c-catalog-content\") pod \"community-operators-r8mcw\" (UID: \"fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c\") " pod="openshift-marketplace/community-operators-r8mcw" Oct 03 15:49:41 crc kubenswrapper[4774]: I1003 15:49:41.487171 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c-utilities\") pod \"community-operators-r8mcw\" (UID: \"fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c\") " 
pod="openshift-marketplace/community-operators-r8mcw" Oct 03 15:49:41 crc kubenswrapper[4774]: I1003 15:49:41.518788 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56v9k\" (UniqueName: \"kubernetes.io/projected/fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c-kube-api-access-56v9k\") pod \"community-operators-r8mcw\" (UID: \"fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c\") " pod="openshift-marketplace/community-operators-r8mcw" Oct 03 15:49:41 crc kubenswrapper[4774]: I1003 15:49:41.687971 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r8mcw" Oct 03 15:49:42 crc kubenswrapper[4774]: I1003 15:49:42.093951 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-425fn"] Oct 03 15:49:42 crc kubenswrapper[4774]: I1003 15:49:42.255596 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r8mcw"] Oct 03 15:49:42 crc kubenswrapper[4774]: I1003 15:49:42.299592 4774 scope.go:117] "RemoveContainer" containerID="6a9aac6dbc322b9090f9a59162378fc7732101d04a69222bfa4d228837707411" Oct 03 15:49:42 crc kubenswrapper[4774]: E1003 15:49:42.300426 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:49:42 crc kubenswrapper[4774]: I1003 15:49:42.370930 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8mcw" event={"ID":"fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c","Type":"ContainerStarted","Data":"1b70f314f35977928c1bd5f89dc8e486a623dfb3b55d45bf9e2ca8706e19263f"} Oct 03 
15:49:42 crc kubenswrapper[4774]: I1003 15:49:42.372300 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-425fn" event={"ID":"a5b55b99-f8ef-4f35-9b44-da46e01ee2eb","Type":"ContainerStarted","Data":"a0ecc55c855587b69d73d84c9ef847971394a6d445de7a1a10c0186e19278b4c"} Oct 03 15:49:43 crc kubenswrapper[4774]: I1003 15:49:43.382298 4774 generic.go:334] "Generic (PLEG): container finished" podID="fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c" containerID="40f66d871fb83bf4323d2569fb4c4d8459cc7bfaca2d5d022b290006b6ebd841" exitCode=0 Oct 03 15:49:43 crc kubenswrapper[4774]: I1003 15:49:43.382514 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8mcw" event={"ID":"fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c","Type":"ContainerDied","Data":"40f66d871fb83bf4323d2569fb4c4d8459cc7bfaca2d5d022b290006b6ebd841"} Oct 03 15:49:43 crc kubenswrapper[4774]: I1003 15:49:43.384525 4774 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 15:49:43 crc kubenswrapper[4774]: I1003 15:49:43.384911 4774 generic.go:334] "Generic (PLEG): container finished" podID="a5b55b99-f8ef-4f35-9b44-da46e01ee2eb" containerID="384ecf27159ada1a2480d791a90e4f037b294485484134b59242a9342c334fc1" exitCode=0 Oct 03 15:49:43 crc kubenswrapper[4774]: I1003 15:49:43.384936 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-425fn" event={"ID":"a5b55b99-f8ef-4f35-9b44-da46e01ee2eb","Type":"ContainerDied","Data":"384ecf27159ada1a2480d791a90e4f037b294485484134b59242a9342c334fc1"} Oct 03 15:49:47 crc kubenswrapper[4774]: I1003 15:49:47.431194 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8mcw" event={"ID":"fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c","Type":"ContainerStarted","Data":"7f5bc38a2bf1ab8cc4a43cfdf5dfa5a0ebb255cd249398dacec51bec81c031a4"} Oct 03 15:49:47 crc kubenswrapper[4774]: 
I1003 15:49:47.434700 4774 generic.go:334] "Generic (PLEG): container finished" podID="a5b55b99-f8ef-4f35-9b44-da46e01ee2eb" containerID="37bc821a75e692717f77369ac60007368e45c89e82bef97efb557b243f00d975" exitCode=0 Oct 03 15:49:47 crc kubenswrapper[4774]: I1003 15:49:47.434746 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-425fn" event={"ID":"a5b55b99-f8ef-4f35-9b44-da46e01ee2eb","Type":"ContainerDied","Data":"37bc821a75e692717f77369ac60007368e45c89e82bef97efb557b243f00d975"} Oct 03 15:49:48 crc kubenswrapper[4774]: I1003 15:49:48.453818 4774 generic.go:334] "Generic (PLEG): container finished" podID="fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c" containerID="7f5bc38a2bf1ab8cc4a43cfdf5dfa5a0ebb255cd249398dacec51bec81c031a4" exitCode=0 Oct 03 15:49:48 crc kubenswrapper[4774]: I1003 15:49:48.453861 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8mcw" event={"ID":"fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c","Type":"ContainerDied","Data":"7f5bc38a2bf1ab8cc4a43cfdf5dfa5a0ebb255cd249398dacec51bec81c031a4"} Oct 03 15:49:49 crc kubenswrapper[4774]: I1003 15:49:49.465877 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-425fn" event={"ID":"a5b55b99-f8ef-4f35-9b44-da46e01ee2eb","Type":"ContainerStarted","Data":"b8d63197a231e10eb2f7a9ecd6031144a9f4096e04038b6666a7b69953233868"} Oct 03 15:49:49 crc kubenswrapper[4774]: I1003 15:49:49.490843 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-425fn" podStartSLOduration=3.455695169 podStartE2EDuration="8.490821471s" podCreationTimestamp="2025-10-03 15:49:41 +0000 UTC" firstStartedPulling="2025-10-03 15:49:43.386035554 +0000 UTC m=+4005.975239006" lastFinishedPulling="2025-10-03 15:49:48.421161856 +0000 UTC m=+4011.010365308" observedRunningTime="2025-10-03 15:49:49.485163781 +0000 UTC m=+4012.074367243" 
watchObservedRunningTime="2025-10-03 15:49:49.490821471 +0000 UTC m=+4012.080024923" Oct 03 15:49:51 crc kubenswrapper[4774]: I1003 15:49:51.432256 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-425fn" Oct 03 15:49:51 crc kubenswrapper[4774]: I1003 15:49:51.432607 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-425fn" Oct 03 15:49:51 crc kubenswrapper[4774]: I1003 15:49:51.499612 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-425fn" Oct 03 15:49:54 crc kubenswrapper[4774]: I1003 15:49:54.299166 4774 scope.go:117] "RemoveContainer" containerID="6a9aac6dbc322b9090f9a59162378fc7732101d04a69222bfa4d228837707411" Oct 03 15:49:54 crc kubenswrapper[4774]: E1003 15:49:54.318458 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:49:55 crc kubenswrapper[4774]: I1003 15:49:55.518401 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8mcw" event={"ID":"fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c","Type":"ContainerStarted","Data":"a3f555c98211bd9a709a15f7bea420692c3d47deb489116ee52dca0e5cd0cd0f"} Oct 03 15:49:55 crc kubenswrapper[4774]: I1003 15:49:55.541339 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r8mcw" podStartSLOduration=4.289981217 podStartE2EDuration="14.541320514s" podCreationTimestamp="2025-10-03 15:49:41 +0000 UTC" firstStartedPulling="2025-10-03 
15:49:43.384307392 +0000 UTC m=+4005.973510834" lastFinishedPulling="2025-10-03 15:49:53.635646669 +0000 UTC m=+4016.224850131" observedRunningTime="2025-10-03 15:49:55.534238068 +0000 UTC m=+4018.123441520" watchObservedRunningTime="2025-10-03 15:49:55.541320514 +0000 UTC m=+4018.130523966" Oct 03 15:50:01 crc kubenswrapper[4774]: I1003 15:50:01.481796 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-425fn" Oct 03 15:50:01 crc kubenswrapper[4774]: I1003 15:50:01.539571 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-425fn"] Oct 03 15:50:01 crc kubenswrapper[4774]: I1003 15:50:01.570363 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-425fn" podUID="a5b55b99-f8ef-4f35-9b44-da46e01ee2eb" containerName="registry-server" containerID="cri-o://b8d63197a231e10eb2f7a9ecd6031144a9f4096e04038b6666a7b69953233868" gracePeriod=2 Oct 03 15:50:01 crc kubenswrapper[4774]: I1003 15:50:01.688891 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r8mcw" Oct 03 15:50:01 crc kubenswrapper[4774]: I1003 15:50:01.689137 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r8mcw" Oct 03 15:50:01 crc kubenswrapper[4774]: I1003 15:50:01.748686 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r8mcw" Oct 03 15:50:02 crc kubenswrapper[4774]: I1003 15:50:02.088401 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-425fn" Oct 03 15:50:02 crc kubenswrapper[4774]: I1003 15:50:02.203688 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5b55b99-f8ef-4f35-9b44-da46e01ee2eb-utilities\") pod \"a5b55b99-f8ef-4f35-9b44-da46e01ee2eb\" (UID: \"a5b55b99-f8ef-4f35-9b44-da46e01ee2eb\") " Oct 03 15:50:02 crc kubenswrapper[4774]: I1003 15:50:02.204182 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64b2j\" (UniqueName: \"kubernetes.io/projected/a5b55b99-f8ef-4f35-9b44-da46e01ee2eb-kube-api-access-64b2j\") pod \"a5b55b99-f8ef-4f35-9b44-da46e01ee2eb\" (UID: \"a5b55b99-f8ef-4f35-9b44-da46e01ee2eb\") " Oct 03 15:50:02 crc kubenswrapper[4774]: I1003 15:50:02.204259 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5b55b99-f8ef-4f35-9b44-da46e01ee2eb-catalog-content\") pod \"a5b55b99-f8ef-4f35-9b44-da46e01ee2eb\" (UID: \"a5b55b99-f8ef-4f35-9b44-da46e01ee2eb\") " Oct 03 15:50:02 crc kubenswrapper[4774]: I1003 15:50:02.205307 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5b55b99-f8ef-4f35-9b44-da46e01ee2eb-utilities" (OuterVolumeSpecName: "utilities") pod "a5b55b99-f8ef-4f35-9b44-da46e01ee2eb" (UID: "a5b55b99-f8ef-4f35-9b44-da46e01ee2eb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:50:02 crc kubenswrapper[4774]: I1003 15:50:02.269244 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5b55b99-f8ef-4f35-9b44-da46e01ee2eb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a5b55b99-f8ef-4f35-9b44-da46e01ee2eb" (UID: "a5b55b99-f8ef-4f35-9b44-da46e01ee2eb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:50:02 crc kubenswrapper[4774]: I1003 15:50:02.307313 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5b55b99-f8ef-4f35-9b44-da46e01ee2eb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 15:50:02 crc kubenswrapper[4774]: I1003 15:50:02.307344 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5b55b99-f8ef-4f35-9b44-da46e01ee2eb-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 15:50:02 crc kubenswrapper[4774]: I1003 15:50:02.588282 4774 generic.go:334] "Generic (PLEG): container finished" podID="a5b55b99-f8ef-4f35-9b44-da46e01ee2eb" containerID="b8d63197a231e10eb2f7a9ecd6031144a9f4096e04038b6666a7b69953233868" exitCode=0 Oct 03 15:50:02 crc kubenswrapper[4774]: I1003 15:50:02.588491 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-425fn" Oct 03 15:50:02 crc kubenswrapper[4774]: I1003 15:50:02.588479 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-425fn" event={"ID":"a5b55b99-f8ef-4f35-9b44-da46e01ee2eb","Type":"ContainerDied","Data":"b8d63197a231e10eb2f7a9ecd6031144a9f4096e04038b6666a7b69953233868"} Oct 03 15:50:02 crc kubenswrapper[4774]: I1003 15:50:02.588560 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-425fn" event={"ID":"a5b55b99-f8ef-4f35-9b44-da46e01ee2eb","Type":"ContainerDied","Data":"a0ecc55c855587b69d73d84c9ef847971394a6d445de7a1a10c0186e19278b4c"} Oct 03 15:50:02 crc kubenswrapper[4774]: I1003 15:50:02.588686 4774 scope.go:117] "RemoveContainer" containerID="b8d63197a231e10eb2f7a9ecd6031144a9f4096e04038b6666a7b69953233868" Oct 03 15:50:02 crc kubenswrapper[4774]: I1003 15:50:02.620135 4774 scope.go:117] "RemoveContainer" 
containerID="37bc821a75e692717f77369ac60007368e45c89e82bef97efb557b243f00d975" Oct 03 15:50:02 crc kubenswrapper[4774]: I1003 15:50:02.649314 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5b55b99-f8ef-4f35-9b44-da46e01ee2eb-kube-api-access-64b2j" (OuterVolumeSpecName: "kube-api-access-64b2j") pod "a5b55b99-f8ef-4f35-9b44-da46e01ee2eb" (UID: "a5b55b99-f8ef-4f35-9b44-da46e01ee2eb"). InnerVolumeSpecName "kube-api-access-64b2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:50:02 crc kubenswrapper[4774]: I1003 15:50:02.666809 4774 scope.go:117] "RemoveContainer" containerID="384ecf27159ada1a2480d791a90e4f037b294485484134b59242a9342c334fc1" Oct 03 15:50:02 crc kubenswrapper[4774]: I1003 15:50:02.714452 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64b2j\" (UniqueName: \"kubernetes.io/projected/a5b55b99-f8ef-4f35-9b44-da46e01ee2eb-kube-api-access-64b2j\") on node \"crc\" DevicePath \"\"" Oct 03 15:50:02 crc kubenswrapper[4774]: I1003 15:50:02.801496 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r8mcw" Oct 03 15:50:02 crc kubenswrapper[4774]: I1003 15:50:02.868610 4774 scope.go:117] "RemoveContainer" containerID="b8d63197a231e10eb2f7a9ecd6031144a9f4096e04038b6666a7b69953233868" Oct 03 15:50:02 crc kubenswrapper[4774]: E1003 15:50:02.869201 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8d63197a231e10eb2f7a9ecd6031144a9f4096e04038b6666a7b69953233868\": container with ID starting with b8d63197a231e10eb2f7a9ecd6031144a9f4096e04038b6666a7b69953233868 not found: ID does not exist" containerID="b8d63197a231e10eb2f7a9ecd6031144a9f4096e04038b6666a7b69953233868" Oct 03 15:50:02 crc kubenswrapper[4774]: I1003 15:50:02.869247 4774 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b8d63197a231e10eb2f7a9ecd6031144a9f4096e04038b6666a7b69953233868"} err="failed to get container status \"b8d63197a231e10eb2f7a9ecd6031144a9f4096e04038b6666a7b69953233868\": rpc error: code = NotFound desc = could not find container \"b8d63197a231e10eb2f7a9ecd6031144a9f4096e04038b6666a7b69953233868\": container with ID starting with b8d63197a231e10eb2f7a9ecd6031144a9f4096e04038b6666a7b69953233868 not found: ID does not exist" Oct 03 15:50:02 crc kubenswrapper[4774]: I1003 15:50:02.869278 4774 scope.go:117] "RemoveContainer" containerID="37bc821a75e692717f77369ac60007368e45c89e82bef97efb557b243f00d975" Oct 03 15:50:02 crc kubenswrapper[4774]: E1003 15:50:02.869762 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37bc821a75e692717f77369ac60007368e45c89e82bef97efb557b243f00d975\": container with ID starting with 37bc821a75e692717f77369ac60007368e45c89e82bef97efb557b243f00d975 not found: ID does not exist" containerID="37bc821a75e692717f77369ac60007368e45c89e82bef97efb557b243f00d975" Oct 03 15:50:02 crc kubenswrapper[4774]: I1003 15:50:02.869799 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37bc821a75e692717f77369ac60007368e45c89e82bef97efb557b243f00d975"} err="failed to get container status \"37bc821a75e692717f77369ac60007368e45c89e82bef97efb557b243f00d975\": rpc error: code = NotFound desc = could not find container \"37bc821a75e692717f77369ac60007368e45c89e82bef97efb557b243f00d975\": container with ID starting with 37bc821a75e692717f77369ac60007368e45c89e82bef97efb557b243f00d975 not found: ID does not exist" Oct 03 15:50:02 crc kubenswrapper[4774]: I1003 15:50:02.869825 4774 scope.go:117] "RemoveContainer" containerID="384ecf27159ada1a2480d791a90e4f037b294485484134b59242a9342c334fc1" Oct 03 15:50:02 crc kubenswrapper[4774]: E1003 15:50:02.871424 4774 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"384ecf27159ada1a2480d791a90e4f037b294485484134b59242a9342c334fc1\": container with ID starting with 384ecf27159ada1a2480d791a90e4f037b294485484134b59242a9342c334fc1 not found: ID does not exist" containerID="384ecf27159ada1a2480d791a90e4f037b294485484134b59242a9342c334fc1" Oct 03 15:50:02 crc kubenswrapper[4774]: I1003 15:50:02.871462 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"384ecf27159ada1a2480d791a90e4f037b294485484134b59242a9342c334fc1"} err="failed to get container status \"384ecf27159ada1a2480d791a90e4f037b294485484134b59242a9342c334fc1\": rpc error: code = NotFound desc = could not find container \"384ecf27159ada1a2480d791a90e4f037b294485484134b59242a9342c334fc1\": container with ID starting with 384ecf27159ada1a2480d791a90e4f037b294485484134b59242a9342c334fc1 not found: ID does not exist" Oct 03 15:50:02 crc kubenswrapper[4774]: I1003 15:50:02.929581 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-425fn"] Oct 03 15:50:02 crc kubenswrapper[4774]: I1003 15:50:02.937124 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-425fn"] Oct 03 15:50:03 crc kubenswrapper[4774]: I1003 15:50:03.314186 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5b55b99-f8ef-4f35-9b44-da46e01ee2eb" path="/var/lib/kubelet/pods/a5b55b99-f8ef-4f35-9b44-da46e01ee2eb/volumes" Oct 03 15:50:03 crc kubenswrapper[4774]: I1003 15:50:03.918364 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r8mcw"] Oct 03 15:50:04 crc kubenswrapper[4774]: I1003 15:50:04.607900 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r8mcw" podUID="fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c" containerName="registry-server" 
containerID="cri-o://a3f555c98211bd9a709a15f7bea420692c3d47deb489116ee52dca0e5cd0cd0f" gracePeriod=2 Oct 03 15:50:05 crc kubenswrapper[4774]: I1003 15:50:05.171935 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r8mcw" Oct 03 15:50:05 crc kubenswrapper[4774]: I1003 15:50:05.264888 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56v9k\" (UniqueName: \"kubernetes.io/projected/fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c-kube-api-access-56v9k\") pod \"fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c\" (UID: \"fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c\") " Oct 03 15:50:05 crc kubenswrapper[4774]: I1003 15:50:05.264988 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c-catalog-content\") pod \"fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c\" (UID: \"fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c\") " Oct 03 15:50:05 crc kubenswrapper[4774]: I1003 15:50:05.265176 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c-utilities\") pod \"fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c\" (UID: \"fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c\") " Oct 03 15:50:05 crc kubenswrapper[4774]: I1003 15:50:05.266044 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c-utilities" (OuterVolumeSpecName: "utilities") pod "fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c" (UID: "fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:50:05 crc kubenswrapper[4774]: I1003 15:50:05.279536 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c-kube-api-access-56v9k" (OuterVolumeSpecName: "kube-api-access-56v9k") pod "fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c" (UID: "fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c"). InnerVolumeSpecName "kube-api-access-56v9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:50:05 crc kubenswrapper[4774]: I1003 15:50:05.329526 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c" (UID: "fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:50:05 crc kubenswrapper[4774]: I1003 15:50:05.367839 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 15:50:05 crc kubenswrapper[4774]: I1003 15:50:05.367873 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 15:50:05 crc kubenswrapper[4774]: I1003 15:50:05.367885 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56v9k\" (UniqueName: \"kubernetes.io/projected/fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c-kube-api-access-56v9k\") on node \"crc\" DevicePath \"\"" Oct 03 15:50:05 crc kubenswrapper[4774]: I1003 15:50:05.620602 4774 generic.go:334] "Generic (PLEG): container finished" podID="fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c" 
containerID="a3f555c98211bd9a709a15f7bea420692c3d47deb489116ee52dca0e5cd0cd0f" exitCode=0 Oct 03 15:50:05 crc kubenswrapper[4774]: I1003 15:50:05.620998 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r8mcw" Oct 03 15:50:05 crc kubenswrapper[4774]: I1003 15:50:05.621028 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8mcw" event={"ID":"fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c","Type":"ContainerDied","Data":"a3f555c98211bd9a709a15f7bea420692c3d47deb489116ee52dca0e5cd0cd0f"} Oct 03 15:50:05 crc kubenswrapper[4774]: I1003 15:50:05.621081 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8mcw" event={"ID":"fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c","Type":"ContainerDied","Data":"1b70f314f35977928c1bd5f89dc8e486a623dfb3b55d45bf9e2ca8706e19263f"} Oct 03 15:50:05 crc kubenswrapper[4774]: I1003 15:50:05.621102 4774 scope.go:117] "RemoveContainer" containerID="a3f555c98211bd9a709a15f7bea420692c3d47deb489116ee52dca0e5cd0cd0f" Oct 03 15:50:05 crc kubenswrapper[4774]: I1003 15:50:05.652731 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r8mcw"] Oct 03 15:50:05 crc kubenswrapper[4774]: I1003 15:50:05.662467 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r8mcw"] Oct 03 15:50:05 crc kubenswrapper[4774]: I1003 15:50:05.666001 4774 scope.go:117] "RemoveContainer" containerID="7f5bc38a2bf1ab8cc4a43cfdf5dfa5a0ebb255cd249398dacec51bec81c031a4" Oct 03 15:50:05 crc kubenswrapper[4774]: I1003 15:50:05.691224 4774 scope.go:117] "RemoveContainer" containerID="40f66d871fb83bf4323d2569fb4c4d8459cc7bfaca2d5d022b290006b6ebd841" Oct 03 15:50:05 crc kubenswrapper[4774]: I1003 15:50:05.740113 4774 scope.go:117] "RemoveContainer" containerID="a3f555c98211bd9a709a15f7bea420692c3d47deb489116ee52dca0e5cd0cd0f" Oct 03 
15:50:05 crc kubenswrapper[4774]: E1003 15:50:05.741777 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3f555c98211bd9a709a15f7bea420692c3d47deb489116ee52dca0e5cd0cd0f\": container with ID starting with a3f555c98211bd9a709a15f7bea420692c3d47deb489116ee52dca0e5cd0cd0f not found: ID does not exist" containerID="a3f555c98211bd9a709a15f7bea420692c3d47deb489116ee52dca0e5cd0cd0f" Oct 03 15:50:05 crc kubenswrapper[4774]: I1003 15:50:05.741827 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3f555c98211bd9a709a15f7bea420692c3d47deb489116ee52dca0e5cd0cd0f"} err="failed to get container status \"a3f555c98211bd9a709a15f7bea420692c3d47deb489116ee52dca0e5cd0cd0f\": rpc error: code = NotFound desc = could not find container \"a3f555c98211bd9a709a15f7bea420692c3d47deb489116ee52dca0e5cd0cd0f\": container with ID starting with a3f555c98211bd9a709a15f7bea420692c3d47deb489116ee52dca0e5cd0cd0f not found: ID does not exist" Oct 03 15:50:05 crc kubenswrapper[4774]: I1003 15:50:05.741854 4774 scope.go:117] "RemoveContainer" containerID="7f5bc38a2bf1ab8cc4a43cfdf5dfa5a0ebb255cd249398dacec51bec81c031a4" Oct 03 15:50:05 crc kubenswrapper[4774]: E1003 15:50:05.742104 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f5bc38a2bf1ab8cc4a43cfdf5dfa5a0ebb255cd249398dacec51bec81c031a4\": container with ID starting with 7f5bc38a2bf1ab8cc4a43cfdf5dfa5a0ebb255cd249398dacec51bec81c031a4 not found: ID does not exist" containerID="7f5bc38a2bf1ab8cc4a43cfdf5dfa5a0ebb255cd249398dacec51bec81c031a4" Oct 03 15:50:05 crc kubenswrapper[4774]: I1003 15:50:05.742164 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f5bc38a2bf1ab8cc4a43cfdf5dfa5a0ebb255cd249398dacec51bec81c031a4"} err="failed to get container status 
\"7f5bc38a2bf1ab8cc4a43cfdf5dfa5a0ebb255cd249398dacec51bec81c031a4\": rpc error: code = NotFound desc = could not find container \"7f5bc38a2bf1ab8cc4a43cfdf5dfa5a0ebb255cd249398dacec51bec81c031a4\": container with ID starting with 7f5bc38a2bf1ab8cc4a43cfdf5dfa5a0ebb255cd249398dacec51bec81c031a4 not found: ID does not exist" Oct 03 15:50:05 crc kubenswrapper[4774]: I1003 15:50:05.742193 4774 scope.go:117] "RemoveContainer" containerID="40f66d871fb83bf4323d2569fb4c4d8459cc7bfaca2d5d022b290006b6ebd841" Oct 03 15:50:05 crc kubenswrapper[4774]: E1003 15:50:05.742429 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40f66d871fb83bf4323d2569fb4c4d8459cc7bfaca2d5d022b290006b6ebd841\": container with ID starting with 40f66d871fb83bf4323d2569fb4c4d8459cc7bfaca2d5d022b290006b6ebd841 not found: ID does not exist" containerID="40f66d871fb83bf4323d2569fb4c4d8459cc7bfaca2d5d022b290006b6ebd841" Oct 03 15:50:05 crc kubenswrapper[4774]: I1003 15:50:05.742451 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40f66d871fb83bf4323d2569fb4c4d8459cc7bfaca2d5d022b290006b6ebd841"} err="failed to get container status \"40f66d871fb83bf4323d2569fb4c4d8459cc7bfaca2d5d022b290006b6ebd841\": rpc error: code = NotFound desc = could not find container \"40f66d871fb83bf4323d2569fb4c4d8459cc7bfaca2d5d022b290006b6ebd841\": container with ID starting with 40f66d871fb83bf4323d2569fb4c4d8459cc7bfaca2d5d022b290006b6ebd841 not found: ID does not exist" Oct 03 15:50:07 crc kubenswrapper[4774]: I1003 15:50:07.300237 4774 scope.go:117] "RemoveContainer" containerID="6a9aac6dbc322b9090f9a59162378fc7732101d04a69222bfa4d228837707411" Oct 03 15:50:07 crc kubenswrapper[4774]: E1003 15:50:07.300796 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:50:07 crc kubenswrapper[4774]: I1003 15:50:07.311248 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c" path="/var/lib/kubelet/pods/fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c/volumes" Oct 03 15:50:17 crc kubenswrapper[4774]: I1003 15:50:17.235802 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5587b8897d-cknc7_79e2926f-2fec-48fb-95d2-c3afcfec7c4c/barbican-api/0.log" Oct 03 15:50:17 crc kubenswrapper[4774]: I1003 15:50:17.275152 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5587b8897d-cknc7_79e2926f-2fec-48fb-95d2-c3afcfec7c4c/barbican-api-log/0.log" Oct 03 15:50:17 crc kubenswrapper[4774]: I1003 15:50:17.423693 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5b679cb88b-5lrzw_3ceaeb2a-3322-44b9-88ae-5c473721a68f/barbican-keystone-listener/0.log" Oct 03 15:50:17 crc kubenswrapper[4774]: I1003 15:50:17.475440 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5b679cb88b-5lrzw_3ceaeb2a-3322-44b9-88ae-5c473721a68f/barbican-keystone-listener-log/0.log" Oct 03 15:50:17 crc kubenswrapper[4774]: I1003 15:50:17.644327 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-74d9f6d95f-l289b_04d07033-b1e8-426d-828b-e78cb0f44294/barbican-worker/0.log" Oct 03 15:50:17 crc kubenswrapper[4774]: I1003 15:50:17.656334 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-74d9f6d95f-l289b_04d07033-b1e8-426d-828b-e78cb0f44294/barbican-worker-log/0.log" Oct 03 15:50:17 crc kubenswrapper[4774]: I1003 15:50:17.869613 4774 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-9h8nb_d70dce54-aa22-4af1-a341-4ff90ba78722/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:50:18 crc kubenswrapper[4774]: I1003 15:50:18.085434 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_cebee680-e845-4735-95f6-e97d844399a3/ceilometer-central-agent/0.log" Oct 03 15:50:18 crc kubenswrapper[4774]: I1003 15:50:18.107511 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_cebee680-e845-4735-95f6-e97d844399a3/ceilometer-notification-agent/0.log" Oct 03 15:50:18 crc kubenswrapper[4774]: I1003 15:50:18.128366 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_cebee680-e845-4735-95f6-e97d844399a3/proxy-httpd/0.log" Oct 03 15:50:18 crc kubenswrapper[4774]: I1003 15:50:18.250524 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_cebee680-e845-4735-95f6-e97d844399a3/sg-core/0.log" Oct 03 15:50:18 crc kubenswrapper[4774]: I1003 15:50:18.299235 4774 scope.go:117] "RemoveContainer" containerID="6a9aac6dbc322b9090f9a59162378fc7732101d04a69222bfa4d228837707411" Oct 03 15:50:18 crc kubenswrapper[4774]: E1003 15:50:18.299577 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s6v5z_openshift-machine-config-operator(ca37ac4b-f421-4198-a179-12901d36f0f5)\"" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" Oct 03 15:50:18 crc kubenswrapper[4774]: I1003 15:50:18.379908 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c2605586-9dec-4e4f-a61d-7a93535cbaa2/cinder-api/0.log" Oct 03 15:50:18 crc kubenswrapper[4774]: I1003 
15:50:18.495381 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c2605586-9dec-4e4f-a61d-7a93535cbaa2/cinder-api-log/0.log" Oct 03 15:50:18 crc kubenswrapper[4774]: I1003 15:50:18.615712 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a370be32-2d52-48b7-b529-53e1d92a89a9/cinder-scheduler/0.log" Oct 03 15:50:18 crc kubenswrapper[4774]: I1003 15:50:18.732233 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a370be32-2d52-48b7-b529-53e1d92a89a9/probe/0.log" Oct 03 15:50:18 crc kubenswrapper[4774]: I1003 15:50:18.975904 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-pnvh8_9d27605e-3b35-4000-a3d5-88cecbf24b5a/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:50:19 crc kubenswrapper[4774]: I1003 15:50:19.113239 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-6shkp_9522d412-aaac-4917-86a0-2d9c40830b8d/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:50:19 crc kubenswrapper[4774]: I1003 15:50:19.278539 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-97dwl_365904b3-7404-4fe1-a9bf-c2711b345c08/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:50:19 crc kubenswrapper[4774]: I1003 15:50:19.399524 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-vrzcl_6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4/init/0.log" Oct 03 15:50:19 crc kubenswrapper[4774]: I1003 15:50:19.589797 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-vrzcl_6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4/dnsmasq-dns/0.log" Oct 03 15:50:19 crc kubenswrapper[4774]: I1003 15:50:19.606363 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-vrzcl_6fd3f029-cab3-4e3f-ab8e-8f156c6bdaf4/init/0.log" Oct 03 15:50:19 crc kubenswrapper[4774]: I1003 15:50:19.768698 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-bh568_58fbe1ae-46c5-4bb6-99ec-61ca05d737b1/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:50:19 crc kubenswrapper[4774]: I1003 15:50:19.858041 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_576a053d-3110-4bf1-a079-512e6bc51cbe/glance-httpd/0.log" Oct 03 15:50:19 crc kubenswrapper[4774]: I1003 15:50:19.953504 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_576a053d-3110-4bf1-a079-512e6bc51cbe/glance-log/0.log" Oct 03 15:50:20 crc kubenswrapper[4774]: I1003 15:50:20.114805 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_24f9335f-6756-4590-9e0a-6bc7bd1f4b3e/glance-httpd/0.log" Oct 03 15:50:20 crc kubenswrapper[4774]: I1003 15:50:20.149163 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_24f9335f-6756-4590-9e0a-6bc7bd1f4b3e/glance-log/0.log" Oct 03 15:50:20 crc kubenswrapper[4774]: I1003 15:50:20.460391 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-f865bb968-k9r7v_c0b4826d-75e2-4023-8d53-3ddd0da5bc2e/horizon/0.log" Oct 03 15:50:20 crc kubenswrapper[4774]: I1003 15:50:20.488357 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-x58cf_2ba66309-584b-4165-86a5-ca30af49d159/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:50:20 crc kubenswrapper[4774]: I1003 15:50:20.712608 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-tzhjs_c8595599-9969-4c2e-bc3a-f2ff038d8c11/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:50:20 crc kubenswrapper[4774]: I1003 15:50:20.731660 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-f865bb968-k9r7v_c0b4826d-75e2-4023-8d53-3ddd0da5bc2e/horizon-log/0.log" Oct 03 15:50:20 crc kubenswrapper[4774]: I1003 15:50:20.935740 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_a584407c-d7f9-436b-a293-fe97f4ed3c78/kube-state-metrics/0.log" Oct 03 15:50:21 crc kubenswrapper[4774]: I1003 15:50:21.025781 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7444dd849d-z82k5_9be0f44b-e4c6-475d-976b-d0b30b456b9c/keystone-api/0.log" Oct 03 15:50:21 crc kubenswrapper[4774]: I1003 15:50:21.135268 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-xz8v2_41e52a3d-812e-4067-a30e-e9f4ad329411/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:50:22 crc kubenswrapper[4774]: I1003 15:50:22.077048 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-987467b4f-dts4l_c5ff7df6-e04c-431b-bdbb-2579172a7706/neutron-httpd/0.log" Oct 03 15:50:22 crc kubenswrapper[4774]: I1003 15:50:22.100989 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-987467b4f-dts4l_c5ff7df6-e04c-431b-bdbb-2579172a7706/neutron-api/0.log" Oct 03 15:50:22 crc kubenswrapper[4774]: I1003 15:50:22.318851 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-kc6g6_62582218-c190-4edd-8539-5ca8e8d348e3/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:50:22 crc kubenswrapper[4774]: I1003 15:50:22.895924 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_96557def-81a7-44a3-86d4-72e10daa7d68/nova-api-log/0.log" Oct 03 15:50:23 crc kubenswrapper[4774]: I1003 15:50:23.101578 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_747916db-cbde-4597-be0a-1e2034b1afca/nova-cell0-conductor-conductor/0.log" Oct 03 15:50:23 crc kubenswrapper[4774]: I1003 15:50:23.246694 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_96557def-81a7-44a3-86d4-72e10daa7d68/nova-api-api/0.log" Oct 03 15:50:23 crc kubenswrapper[4774]: I1003 15:50:23.740244 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_18303598-0afd-48d0-a93a-a523807d8e37/nova-cell1-conductor-conductor/0.log" Oct 03 15:50:23 crc kubenswrapper[4774]: I1003 15:50:23.778916 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_fec122f7-2237-49c0-b5aa-4e251827b058/nova-cell1-novncproxy-novncproxy/0.log" Oct 03 15:50:24 crc kubenswrapper[4774]: I1003 15:50:24.040355 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-8vrlt_a3033c9b-77df-46fe-b9f6-34fedecfbdc8/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:50:24 crc kubenswrapper[4774]: I1003 15:50:24.245947 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_3208c93b-4d66-477f-8255-677d70a111a1/nova-metadata-log/0.log" Oct 03 15:50:24 crc kubenswrapper[4774]: I1003 15:50:24.699113 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_e38e5521-609a-4612-ad67-512c8a477e77/nova-scheduler-scheduler/0.log" Oct 03 15:50:24 crc kubenswrapper[4774]: I1003 15:50:24.720463 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_27254696-8788-47fe-a9a4-208fd295e427/mysql-bootstrap/0.log" Oct 03 15:50:24 crc kubenswrapper[4774]: I1003 
15:50:24.964986 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_27254696-8788-47fe-a9a4-208fd295e427/mysql-bootstrap/0.log" Oct 03 15:50:24 crc kubenswrapper[4774]: I1003 15:50:24.979041 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_27254696-8788-47fe-a9a4-208fd295e427/galera/0.log" Oct 03 15:50:25 crc kubenswrapper[4774]: I1003 15:50:25.263596 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a5455d9b-4489-4041-b44d-990124dd84e4/mysql-bootstrap/0.log" Oct 03 15:50:25 crc kubenswrapper[4774]: I1003 15:50:25.416743 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a5455d9b-4489-4041-b44d-990124dd84e4/mysql-bootstrap/0.log" Oct 03 15:50:25 crc kubenswrapper[4774]: I1003 15:50:25.518110 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a5455d9b-4489-4041-b44d-990124dd84e4/galera/0.log" Oct 03 15:50:25 crc kubenswrapper[4774]: I1003 15:50:25.598826 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_3208c93b-4d66-477f-8255-677d70a111a1/nova-metadata-metadata/0.log" Oct 03 15:50:25 crc kubenswrapper[4774]: I1003 15:50:25.750502 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_f975364c-ff2b-49bb-9e5e-c0fea0d15daa/openstackclient/0.log" Oct 03 15:50:25 crc kubenswrapper[4774]: I1003 15:50:25.839157 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-4wgb7_b9111154-59d2-4b07-b8c3-db1870883cde/ovn-controller/0.log" Oct 03 15:50:26 crc kubenswrapper[4774]: I1003 15:50:26.085476 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-kh252_b62dc8be-5127-4f62-bdf9-f1db2425c2c1/openstack-network-exporter/0.log" Oct 03 15:50:26 crc kubenswrapper[4774]: I1003 15:50:26.196968 4774 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bhmcl_03ee58bb-cd78-4fdb-986f-a9b60f9998e8/ovsdb-server-init/0.log" Oct 03 15:50:26 crc kubenswrapper[4774]: I1003 15:50:26.412605 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bhmcl_03ee58bb-cd78-4fdb-986f-a9b60f9998e8/ovs-vswitchd/0.log" Oct 03 15:50:26 crc kubenswrapper[4774]: I1003 15:50:26.414436 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bhmcl_03ee58bb-cd78-4fdb-986f-a9b60f9998e8/ovsdb-server/0.log" Oct 03 15:50:26 crc kubenswrapper[4774]: I1003 15:50:26.432326 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bhmcl_03ee58bb-cd78-4fdb-986f-a9b60f9998e8/ovsdb-server-init/0.log" Oct 03 15:50:26 crc kubenswrapper[4774]: I1003 15:50:26.631564 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-45z47_579db85b-3b4c-45b4-8bae-1b5a02c80e15/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:50:26 crc kubenswrapper[4774]: I1003 15:50:26.834559 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_148decd8-3262-44d1-858e-523458f7c1ee/openstack-network-exporter/0.log" Oct 03 15:50:26 crc kubenswrapper[4774]: I1003 15:50:26.859534 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_148decd8-3262-44d1-858e-523458f7c1ee/ovn-northd/0.log" Oct 03 15:50:27 crc kubenswrapper[4774]: I1003 15:50:27.095182 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66/openstack-network-exporter/0.log" Oct 03 15:50:27 crc kubenswrapper[4774]: I1003 15:50:27.098794 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_fa7eacfe-0cda-4c1a-be8c-dedca3dc5c66/ovsdbserver-nb/0.log" Oct 03 15:50:27 crc kubenswrapper[4774]: I1003 
15:50:27.331029 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c880318a-5ff5-46f8-aca9-134c52ed3ad1/ovsdbserver-sb/0.log" Oct 03 15:50:27 crc kubenswrapper[4774]: I1003 15:50:27.341243 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c880318a-5ff5-46f8-aca9-134c52ed3ad1/openstack-network-exporter/0.log" Oct 03 15:50:27 crc kubenswrapper[4774]: I1003 15:50:27.671289 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-bc86db558-frxdt_e952bb63-3c66-43a3-a8ef-34e636f1b400/placement-api/0.log" Oct 03 15:50:27 crc kubenswrapper[4774]: I1003 15:50:27.689740 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-bc86db558-frxdt_e952bb63-3c66-43a3-a8ef-34e636f1b400/placement-log/0.log" Oct 03 15:50:27 crc kubenswrapper[4774]: I1003 15:50:27.837099 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_417bcf92-1c5e-4977-a197-62b603b795a2/setup-container/0.log" Oct 03 15:50:28 crc kubenswrapper[4774]: I1003 15:50:28.129000 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_417bcf92-1c5e-4977-a197-62b603b795a2/setup-container/0.log" Oct 03 15:50:28 crc kubenswrapper[4774]: I1003 15:50:28.143297 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_417bcf92-1c5e-4977-a197-62b603b795a2/rabbitmq/0.log" Oct 03 15:50:28 crc kubenswrapper[4774]: I1003 15:50:28.360489 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6d6cfa86-4356-4d79-9edd-977355592186/setup-container/0.log" Oct 03 15:50:28 crc kubenswrapper[4774]: I1003 15:50:28.531264 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6d6cfa86-4356-4d79-9edd-977355592186/rabbitmq/0.log" Oct 03 15:50:28 crc kubenswrapper[4774]: I1003 15:50:28.615031 4774 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6d6cfa86-4356-4d79-9edd-977355592186/setup-container/0.log" Oct 03 15:50:28 crc kubenswrapper[4774]: I1003 15:50:28.773486 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-dshx4_776f45f5-5644-428e-a25f-9e3b36960fd9/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:50:28 crc kubenswrapper[4774]: I1003 15:50:28.908795 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-vchpx_4d2ae95f-a86b-4b58-a529-7b5d426bff79/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:50:29 crc kubenswrapper[4774]: I1003 15:50:29.123692 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-r6pdh_136d617b-f485-4841-b6e2-350b591cd22e/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:50:29 crc kubenswrapper[4774]: I1003 15:50:29.204210 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-mrdws_81364ab7-a73d-4fef-b065-62983751634b/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:50:29 crc kubenswrapper[4774]: I1003 15:50:29.422461 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-rnpx9_04fa8c76-a20b-4ae9-86f0-fe4801763d0e/ssh-known-hosts-edpm-deployment/0.log" Oct 03 15:50:29 crc kubenswrapper[4774]: I1003 15:50:29.665287 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5c49d658df-5r5zg_4f82147b-63cd-44bc-8950-bf87fa407688/proxy-server/0.log" Oct 03 15:50:29 crc kubenswrapper[4774]: I1003 15:50:29.818131 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5c49d658df-5r5zg_4f82147b-63cd-44bc-8950-bf87fa407688/proxy-httpd/0.log" Oct 03 15:50:29 crc kubenswrapper[4774]: I1003 
15:50:29.830826 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-xxj5b_5244bd24-f205-4576-a7cd-6da859f28e21/swift-ring-rebalance/0.log" Oct 03 15:50:30 crc kubenswrapper[4774]: I1003 15:50:30.033115 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc0b2c39-8c1e-4401-97f6-a4306b435436/account-auditor/0.log" Oct 03 15:50:30 crc kubenswrapper[4774]: I1003 15:50:30.033815 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc0b2c39-8c1e-4401-97f6-a4306b435436/account-reaper/0.log" Oct 03 15:50:30 crc kubenswrapper[4774]: I1003 15:50:30.212510 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc0b2c39-8c1e-4401-97f6-a4306b435436/container-auditor/0.log" Oct 03 15:50:30 crc kubenswrapper[4774]: I1003 15:50:30.232253 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc0b2c39-8c1e-4401-97f6-a4306b435436/account-replicator/0.log" Oct 03 15:50:30 crc kubenswrapper[4774]: I1003 15:50:30.327272 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc0b2c39-8c1e-4401-97f6-a4306b435436/account-server/0.log" Oct 03 15:50:30 crc kubenswrapper[4774]: I1003 15:50:30.416515 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc0b2c39-8c1e-4401-97f6-a4306b435436/container-server/0.log" Oct 03 15:50:30 crc kubenswrapper[4774]: I1003 15:50:30.468667 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc0b2c39-8c1e-4401-97f6-a4306b435436/container-replicator/0.log" Oct 03 15:50:30 crc kubenswrapper[4774]: I1003 15:50:30.552151 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc0b2c39-8c1e-4401-97f6-a4306b435436/container-updater/0.log" Oct 03 15:50:30 crc kubenswrapper[4774]: I1003 15:50:30.665179 4774 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_swift-storage-0_bc0b2c39-8c1e-4401-97f6-a4306b435436/object-auditor/0.log" Oct 03 15:50:30 crc kubenswrapper[4774]: I1003 15:50:30.709112 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc0b2c39-8c1e-4401-97f6-a4306b435436/object-expirer/0.log" Oct 03 15:50:30 crc kubenswrapper[4774]: I1003 15:50:30.781132 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc0b2c39-8c1e-4401-97f6-a4306b435436/object-replicator/0.log" Oct 03 15:50:30 crc kubenswrapper[4774]: I1003 15:50:30.838168 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc0b2c39-8c1e-4401-97f6-a4306b435436/object-server/0.log" Oct 03 15:50:30 crc kubenswrapper[4774]: I1003 15:50:30.945434 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc0b2c39-8c1e-4401-97f6-a4306b435436/object-updater/0.log" Oct 03 15:50:31 crc kubenswrapper[4774]: I1003 15:50:31.017009 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc0b2c39-8c1e-4401-97f6-a4306b435436/rsync/0.log" Oct 03 15:50:31 crc kubenswrapper[4774]: I1003 15:50:31.455470 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc0b2c39-8c1e-4401-97f6-a4306b435436/swift-recon-cron/0.log" Oct 03 15:50:31 crc kubenswrapper[4774]: I1003 15:50:31.497814 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-dvvf7_433217d2-80d5-452b-9980-c1aaac39b5c1/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:50:31 crc kubenswrapper[4774]: I1003 15:50:31.746495 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_7b63f9fa-2194-46e4-bfe0-d7efb33f10fb/tempest-tests-tempest-tests-runner/0.log" Oct 03 15:50:31 crc kubenswrapper[4774]: I1003 15:50:31.911755 4774 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_90552d32-4d94-4fc4-b843-60a78206b347/test-operator-logs-container/0.log" Oct 03 15:50:32 crc kubenswrapper[4774]: I1003 15:50:32.048836 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-lkc9p_b021b515-09e3-4fcd-b448-c8169043f86c/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:50:33 crc kubenswrapper[4774]: I1003 15:50:33.304410 4774 scope.go:117] "RemoveContainer" containerID="6a9aac6dbc322b9090f9a59162378fc7732101d04a69222bfa4d228837707411" Oct 03 15:50:33 crc kubenswrapper[4774]: I1003 15:50:33.884502 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" event={"ID":"ca37ac4b-f421-4198-a179-12901d36f0f5","Type":"ContainerStarted","Data":"977d7962c88000b86f065a324347466b050767bdc139cbee70676501833807d1"} Oct 03 15:50:40 crc kubenswrapper[4774]: I1003 15:50:40.313036 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_09034f5f-3011-4604-8b05-f8a3fef6a74a/memcached/0.log" Oct 03 15:51:08 crc kubenswrapper[4774]: I1003 15:51:08.191423 4774 generic.go:334] "Generic (PLEG): container finished" podID="595ae358-51f7-4e1f-830d-a21614c7726c" containerID="0f125f54ee63c120a173107e0f934ef6dfaec0b7ef3cb99aceb30884c4e5a1ef" exitCode=0 Oct 03 15:51:08 crc kubenswrapper[4774]: I1003 15:51:08.191992 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2968w/crc-debug-zhtcc" event={"ID":"595ae358-51f7-4e1f-830d-a21614c7726c","Type":"ContainerDied","Data":"0f125f54ee63c120a173107e0f934ef6dfaec0b7ef3cb99aceb30884c4e5a1ef"} Oct 03 15:51:09 crc kubenswrapper[4774]: I1003 15:51:09.308045 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2968w/crc-debug-zhtcc" Oct 03 15:51:09 crc kubenswrapper[4774]: I1003 15:51:09.341607 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-must-gather-2968w/crc-debug-zhtcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"595ae358-51f7-4e1f-830d-a21614c7726c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T15:51:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"}]}}\" for pod \"openshift-must-gather-2968w\"/\"crc-debug-zhtcc\": pods \"crc-debug-zhtcc\" not found" Oct 03 15:51:09 crc kubenswrapper[4774]: I1003 15:51:09.349915 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2968w/crc-debug-zhtcc"] Oct 03 15:51:09 crc kubenswrapper[4774]: I1003 15:51:09.356692 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2968w/crc-debug-zhtcc"] Oct 03 15:51:09 crc kubenswrapper[4774]: I1003 15:51:09.484490 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5qxc\" (UniqueName: \"kubernetes.io/projected/595ae358-51f7-4e1f-830d-a21614c7726c-kube-api-access-f5qxc\") pod \"595ae358-51f7-4e1f-830d-a21614c7726c\" (UID: \"595ae358-51f7-4e1f-830d-a21614c7726c\") " Oct 03 15:51:09 crc kubenswrapper[4774]: I1003 15:51:09.484689 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/595ae358-51f7-4e1f-830d-a21614c7726c-host\") pod \"595ae358-51f7-4e1f-830d-a21614c7726c\" (UID: \"595ae358-51f7-4e1f-830d-a21614c7726c\") " Oct 03 15:51:09 crc kubenswrapper[4774]: I1003 15:51:09.484841 4774 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/595ae358-51f7-4e1f-830d-a21614c7726c-host" (OuterVolumeSpecName: "host") pod "595ae358-51f7-4e1f-830d-a21614c7726c" (UID: "595ae358-51f7-4e1f-830d-a21614c7726c"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 15:51:09 crc kubenswrapper[4774]: I1003 15:51:09.485549 4774 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/595ae358-51f7-4e1f-830d-a21614c7726c-host\") on node \"crc\" DevicePath \"\"" Oct 03 15:51:09 crc kubenswrapper[4774]: I1003 15:51:09.502657 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/595ae358-51f7-4e1f-830d-a21614c7726c-kube-api-access-f5qxc" (OuterVolumeSpecName: "kube-api-access-f5qxc") pod "595ae358-51f7-4e1f-830d-a21614c7726c" (UID: "595ae358-51f7-4e1f-830d-a21614c7726c"). InnerVolumeSpecName "kube-api-access-f5qxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:51:09 crc kubenswrapper[4774]: I1003 15:51:09.588060 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5qxc\" (UniqueName: \"kubernetes.io/projected/595ae358-51f7-4e1f-830d-a21614c7726c-kube-api-access-f5qxc\") on node \"crc\" DevicePath \"\"" Oct 03 15:51:10 crc kubenswrapper[4774]: I1003 15:51:10.216757 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18a818b00b83d0c659bb21bab4655b53706935ae5992a8a260cc8ad5fedceb18" Oct 03 15:51:10 crc kubenswrapper[4774]: I1003 15:51:10.216917 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2968w/crc-debug-zhtcc" Oct 03 15:51:10 crc kubenswrapper[4774]: I1003 15:51:10.525467 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2968w/crc-debug-5l54k"] Oct 03 15:51:10 crc kubenswrapper[4774]: E1003 15:51:10.526194 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c" containerName="extract-content" Oct 03 15:51:10 crc kubenswrapper[4774]: I1003 15:51:10.526210 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c" containerName="extract-content" Oct 03 15:51:10 crc kubenswrapper[4774]: E1003 15:51:10.526230 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5b55b99-f8ef-4f35-9b44-da46e01ee2eb" containerName="extract-utilities" Oct 03 15:51:10 crc kubenswrapper[4774]: I1003 15:51:10.526237 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5b55b99-f8ef-4f35-9b44-da46e01ee2eb" containerName="extract-utilities" Oct 03 15:51:10 crc kubenswrapper[4774]: E1003 15:51:10.526248 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5b55b99-f8ef-4f35-9b44-da46e01ee2eb" containerName="registry-server" Oct 03 15:51:10 crc kubenswrapper[4774]: I1003 15:51:10.526254 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5b55b99-f8ef-4f35-9b44-da46e01ee2eb" containerName="registry-server" Oct 03 15:51:10 crc kubenswrapper[4774]: E1003 15:51:10.526273 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c" containerName="registry-server" Oct 03 15:51:10 crc kubenswrapper[4774]: I1003 15:51:10.526279 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c" containerName="registry-server" Oct 03 15:51:10 crc kubenswrapper[4774]: E1003 15:51:10.526292 4774 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="595ae358-51f7-4e1f-830d-a21614c7726c" containerName="container-00" Oct 03 15:51:10 crc kubenswrapper[4774]: I1003 15:51:10.526298 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="595ae358-51f7-4e1f-830d-a21614c7726c" containerName="container-00" Oct 03 15:51:10 crc kubenswrapper[4774]: E1003 15:51:10.526309 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c" containerName="extract-utilities" Oct 03 15:51:10 crc kubenswrapper[4774]: I1003 15:51:10.526316 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c" containerName="extract-utilities" Oct 03 15:51:10 crc kubenswrapper[4774]: E1003 15:51:10.526324 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5b55b99-f8ef-4f35-9b44-da46e01ee2eb" containerName="extract-content" Oct 03 15:51:10 crc kubenswrapper[4774]: I1003 15:51:10.526331 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5b55b99-f8ef-4f35-9b44-da46e01ee2eb" containerName="extract-content" Oct 03 15:51:10 crc kubenswrapper[4774]: I1003 15:51:10.526568 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="595ae358-51f7-4e1f-830d-a21614c7726c" containerName="container-00" Oct 03 15:51:10 crc kubenswrapper[4774]: I1003 15:51:10.526586 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5b55b99-f8ef-4f35-9b44-da46e01ee2eb" containerName="registry-server" Oct 03 15:51:10 crc kubenswrapper[4774]: I1003 15:51:10.526595 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb548d4c-da1a-4e4d-9f2f-36bcb1087e1c" containerName="registry-server" Oct 03 15:51:10 crc kubenswrapper[4774]: I1003 15:51:10.527276 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2968w/crc-debug-5l54k" Oct 03 15:51:10 crc kubenswrapper[4774]: I1003 15:51:10.529914 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-2968w"/"default-dockercfg-vcgvt" Oct 03 15:51:10 crc kubenswrapper[4774]: I1003 15:51:10.709444 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr9h6\" (UniqueName: \"kubernetes.io/projected/d74de5f0-638a-49d7-a098-0616e215e17d-kube-api-access-cr9h6\") pod \"crc-debug-5l54k\" (UID: \"d74de5f0-638a-49d7-a098-0616e215e17d\") " pod="openshift-must-gather-2968w/crc-debug-5l54k" Oct 03 15:51:10 crc kubenswrapper[4774]: I1003 15:51:10.709647 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d74de5f0-638a-49d7-a098-0616e215e17d-host\") pod \"crc-debug-5l54k\" (UID: \"d74de5f0-638a-49d7-a098-0616e215e17d\") " pod="openshift-must-gather-2968w/crc-debug-5l54k" Oct 03 15:51:10 crc kubenswrapper[4774]: I1003 15:51:10.811608 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr9h6\" (UniqueName: \"kubernetes.io/projected/d74de5f0-638a-49d7-a098-0616e215e17d-kube-api-access-cr9h6\") pod \"crc-debug-5l54k\" (UID: \"d74de5f0-638a-49d7-a098-0616e215e17d\") " pod="openshift-must-gather-2968w/crc-debug-5l54k" Oct 03 15:51:10 crc kubenswrapper[4774]: I1003 15:51:10.811799 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d74de5f0-638a-49d7-a098-0616e215e17d-host\") pod \"crc-debug-5l54k\" (UID: \"d74de5f0-638a-49d7-a098-0616e215e17d\") " pod="openshift-must-gather-2968w/crc-debug-5l54k" Oct 03 15:51:10 crc kubenswrapper[4774]: I1003 15:51:10.811942 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/d74de5f0-638a-49d7-a098-0616e215e17d-host\") pod \"crc-debug-5l54k\" (UID: \"d74de5f0-638a-49d7-a098-0616e215e17d\") " pod="openshift-must-gather-2968w/crc-debug-5l54k" Oct 03 15:51:10 crc kubenswrapper[4774]: I1003 15:51:10.833971 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr9h6\" (UniqueName: \"kubernetes.io/projected/d74de5f0-638a-49d7-a098-0616e215e17d-kube-api-access-cr9h6\") pod \"crc-debug-5l54k\" (UID: \"d74de5f0-638a-49d7-a098-0616e215e17d\") " pod="openshift-must-gather-2968w/crc-debug-5l54k" Oct 03 15:51:10 crc kubenswrapper[4774]: I1003 15:51:10.846965 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2968w/crc-debug-5l54k" Oct 03 15:51:11 crc kubenswrapper[4774]: I1003 15:51:11.235463 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2968w/crc-debug-5l54k" event={"ID":"d74de5f0-638a-49d7-a098-0616e215e17d","Type":"ContainerStarted","Data":"28e0cd91df6ba30c25f7fe58c44e559165eb2dad94db789d13c72fee0c05f71e"} Oct 03 15:51:11 crc kubenswrapper[4774]: I1003 15:51:11.235903 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2968w/crc-debug-5l54k" event={"ID":"d74de5f0-638a-49d7-a098-0616e215e17d","Type":"ContainerStarted","Data":"4e2d5677c830409b95ab073612736f8fbc16df63b06fa13a1ed8b4557b2d55da"} Oct 03 15:51:11 crc kubenswrapper[4774]: I1003 15:51:11.260710 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2968w/crc-debug-5l54k" podStartSLOduration=1.260686355 podStartE2EDuration="1.260686355s" podCreationTimestamp="2025-10-03 15:51:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:51:11.255488236 +0000 UTC m=+4093.844691708" watchObservedRunningTime="2025-10-03 15:51:11.260686355 +0000 UTC m=+4093.849889807" Oct 03 
15:51:11 crc kubenswrapper[4774]: I1003 15:51:11.319058 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="595ae358-51f7-4e1f-830d-a21614c7726c" path="/var/lib/kubelet/pods/595ae358-51f7-4e1f-830d-a21614c7726c/volumes" Oct 03 15:51:12 crc kubenswrapper[4774]: I1003 15:51:12.246142 4774 generic.go:334] "Generic (PLEG): container finished" podID="d74de5f0-638a-49d7-a098-0616e215e17d" containerID="28e0cd91df6ba30c25f7fe58c44e559165eb2dad94db789d13c72fee0c05f71e" exitCode=0 Oct 03 15:51:12 crc kubenswrapper[4774]: I1003 15:51:12.246222 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2968w/crc-debug-5l54k" event={"ID":"d74de5f0-638a-49d7-a098-0616e215e17d","Type":"ContainerDied","Data":"28e0cd91df6ba30c25f7fe58c44e559165eb2dad94db789d13c72fee0c05f71e"} Oct 03 15:51:13 crc kubenswrapper[4774]: I1003 15:51:13.354856 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2968w/crc-debug-5l54k" Oct 03 15:51:13 crc kubenswrapper[4774]: I1003 15:51:13.454572 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d74de5f0-638a-49d7-a098-0616e215e17d-host\") pod \"d74de5f0-638a-49d7-a098-0616e215e17d\" (UID: \"d74de5f0-638a-49d7-a098-0616e215e17d\") " Oct 03 15:51:13 crc kubenswrapper[4774]: I1003 15:51:13.454649 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d74de5f0-638a-49d7-a098-0616e215e17d-host" (OuterVolumeSpecName: "host") pod "d74de5f0-638a-49d7-a098-0616e215e17d" (UID: "d74de5f0-638a-49d7-a098-0616e215e17d"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 15:51:13 crc kubenswrapper[4774]: I1003 15:51:13.454704 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cr9h6\" (UniqueName: \"kubernetes.io/projected/d74de5f0-638a-49d7-a098-0616e215e17d-kube-api-access-cr9h6\") pod \"d74de5f0-638a-49d7-a098-0616e215e17d\" (UID: \"d74de5f0-638a-49d7-a098-0616e215e17d\") " Oct 03 15:51:13 crc kubenswrapper[4774]: I1003 15:51:13.455155 4774 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d74de5f0-638a-49d7-a098-0616e215e17d-host\") on node \"crc\" DevicePath \"\"" Oct 03 15:51:13 crc kubenswrapper[4774]: I1003 15:51:13.474919 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d74de5f0-638a-49d7-a098-0616e215e17d-kube-api-access-cr9h6" (OuterVolumeSpecName: "kube-api-access-cr9h6") pod "d74de5f0-638a-49d7-a098-0616e215e17d" (UID: "d74de5f0-638a-49d7-a098-0616e215e17d"). InnerVolumeSpecName "kube-api-access-cr9h6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:51:13 crc kubenswrapper[4774]: I1003 15:51:13.556529 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cr9h6\" (UniqueName: \"kubernetes.io/projected/d74de5f0-638a-49d7-a098-0616e215e17d-kube-api-access-cr9h6\") on node \"crc\" DevicePath \"\"" Oct 03 15:51:14 crc kubenswrapper[4774]: I1003 15:51:14.266432 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2968w/crc-debug-5l54k" event={"ID":"d74de5f0-638a-49d7-a098-0616e215e17d","Type":"ContainerDied","Data":"4e2d5677c830409b95ab073612736f8fbc16df63b06fa13a1ed8b4557b2d55da"} Oct 03 15:51:14 crc kubenswrapper[4774]: I1003 15:51:14.266497 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e2d5677c830409b95ab073612736f8fbc16df63b06fa13a1ed8b4557b2d55da" Oct 03 15:51:14 crc kubenswrapper[4774]: I1003 15:51:14.266563 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2968w/crc-debug-5l54k" Oct 03 15:51:14 crc kubenswrapper[4774]: E1003 15:51:14.373593 4774 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd74de5f0_638a_49d7_a098_0616e215e17d.slice/crio-4e2d5677c830409b95ab073612736f8fbc16df63b06fa13a1ed8b4557b2d55da\": RecentStats: unable to find data in memory cache]" Oct 03 15:51:18 crc kubenswrapper[4774]: I1003 15:51:18.489664 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2968w/crc-debug-5l54k"] Oct 03 15:51:18 crc kubenswrapper[4774]: I1003 15:51:18.497494 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2968w/crc-debug-5l54k"] Oct 03 15:51:19 crc kubenswrapper[4774]: I1003 15:51:19.322474 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d74de5f0-638a-49d7-a098-0616e215e17d" 
path="/var/lib/kubelet/pods/d74de5f0-638a-49d7-a098-0616e215e17d/volumes" Oct 03 15:51:19 crc kubenswrapper[4774]: I1003 15:51:19.641947 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2968w/crc-debug-2847t"] Oct 03 15:51:19 crc kubenswrapper[4774]: E1003 15:51:19.642338 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d74de5f0-638a-49d7-a098-0616e215e17d" containerName="container-00" Oct 03 15:51:19 crc kubenswrapper[4774]: I1003 15:51:19.642350 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="d74de5f0-638a-49d7-a098-0616e215e17d" containerName="container-00" Oct 03 15:51:19 crc kubenswrapper[4774]: I1003 15:51:19.642531 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="d74de5f0-638a-49d7-a098-0616e215e17d" containerName="container-00" Oct 03 15:51:19 crc kubenswrapper[4774]: I1003 15:51:19.643097 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2968w/crc-debug-2847t" Oct 03 15:51:19 crc kubenswrapper[4774]: I1003 15:51:19.645017 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-2968w"/"default-dockercfg-vcgvt" Oct 03 15:51:19 crc kubenswrapper[4774]: I1003 15:51:19.762235 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e165058e-74f7-4f25-9bf6-27f78528246d-host\") pod \"crc-debug-2847t\" (UID: \"e165058e-74f7-4f25-9bf6-27f78528246d\") " pod="openshift-must-gather-2968w/crc-debug-2847t" Oct 03 15:51:19 crc kubenswrapper[4774]: I1003 15:51:19.762451 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcdpp\" (UniqueName: \"kubernetes.io/projected/e165058e-74f7-4f25-9bf6-27f78528246d-kube-api-access-kcdpp\") pod \"crc-debug-2847t\" (UID: \"e165058e-74f7-4f25-9bf6-27f78528246d\") " pod="openshift-must-gather-2968w/crc-debug-2847t" 
Oct 03 15:51:19 crc kubenswrapper[4774]: I1003 15:51:19.865040 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcdpp\" (UniqueName: \"kubernetes.io/projected/e165058e-74f7-4f25-9bf6-27f78528246d-kube-api-access-kcdpp\") pod \"crc-debug-2847t\" (UID: \"e165058e-74f7-4f25-9bf6-27f78528246d\") " pod="openshift-must-gather-2968w/crc-debug-2847t" Oct 03 15:51:19 crc kubenswrapper[4774]: I1003 15:51:19.865252 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e165058e-74f7-4f25-9bf6-27f78528246d-host\") pod \"crc-debug-2847t\" (UID: \"e165058e-74f7-4f25-9bf6-27f78528246d\") " pod="openshift-must-gather-2968w/crc-debug-2847t" Oct 03 15:51:19 crc kubenswrapper[4774]: I1003 15:51:19.865369 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e165058e-74f7-4f25-9bf6-27f78528246d-host\") pod \"crc-debug-2847t\" (UID: \"e165058e-74f7-4f25-9bf6-27f78528246d\") " pod="openshift-must-gather-2968w/crc-debug-2847t" Oct 03 15:51:19 crc kubenswrapper[4774]: I1003 15:51:19.884675 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcdpp\" (UniqueName: \"kubernetes.io/projected/e165058e-74f7-4f25-9bf6-27f78528246d-kube-api-access-kcdpp\") pod \"crc-debug-2847t\" (UID: \"e165058e-74f7-4f25-9bf6-27f78528246d\") " pod="openshift-must-gather-2968w/crc-debug-2847t" Oct 03 15:51:19 crc kubenswrapper[4774]: I1003 15:51:19.971874 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2968w/crc-debug-2847t" Oct 03 15:51:20 crc kubenswrapper[4774]: I1003 15:51:20.329744 4774 generic.go:334] "Generic (PLEG): container finished" podID="e165058e-74f7-4f25-9bf6-27f78528246d" containerID="356974452cbf2a99938e87e1e11d300153ef50685d743910a5af56a3a1e1f73b" exitCode=0 Oct 03 15:51:20 crc kubenswrapper[4774]: I1003 15:51:20.329830 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2968w/crc-debug-2847t" event={"ID":"e165058e-74f7-4f25-9bf6-27f78528246d","Type":"ContainerDied","Data":"356974452cbf2a99938e87e1e11d300153ef50685d743910a5af56a3a1e1f73b"} Oct 03 15:51:20 crc kubenswrapper[4774]: I1003 15:51:20.330075 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2968w/crc-debug-2847t" event={"ID":"e165058e-74f7-4f25-9bf6-27f78528246d","Type":"ContainerStarted","Data":"fbf3ed53e8d34f5392e832177939118b3d5adcbffdc6cc424b6fa0aafebd0c13"} Oct 03 15:51:20 crc kubenswrapper[4774]: I1003 15:51:20.369772 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2968w/crc-debug-2847t"] Oct 03 15:51:20 crc kubenswrapper[4774]: I1003 15:51:20.378622 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2968w/crc-debug-2847t"] Oct 03 15:51:21 crc kubenswrapper[4774]: I1003 15:51:21.453163 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2968w/crc-debug-2847t" Oct 03 15:51:21 crc kubenswrapper[4774]: I1003 15:51:21.595881 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e165058e-74f7-4f25-9bf6-27f78528246d-host\") pod \"e165058e-74f7-4f25-9bf6-27f78528246d\" (UID: \"e165058e-74f7-4f25-9bf6-27f78528246d\") " Oct 03 15:51:21 crc kubenswrapper[4774]: I1003 15:51:21.595982 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcdpp\" (UniqueName: \"kubernetes.io/projected/e165058e-74f7-4f25-9bf6-27f78528246d-kube-api-access-kcdpp\") pod \"e165058e-74f7-4f25-9bf6-27f78528246d\" (UID: \"e165058e-74f7-4f25-9bf6-27f78528246d\") " Oct 03 15:51:21 crc kubenswrapper[4774]: I1003 15:51:21.596009 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e165058e-74f7-4f25-9bf6-27f78528246d-host" (OuterVolumeSpecName: "host") pod "e165058e-74f7-4f25-9bf6-27f78528246d" (UID: "e165058e-74f7-4f25-9bf6-27f78528246d"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 15:51:21 crc kubenswrapper[4774]: I1003 15:51:21.596501 4774 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e165058e-74f7-4f25-9bf6-27f78528246d-host\") on node \"crc\" DevicePath \"\"" Oct 03 15:51:21 crc kubenswrapper[4774]: I1003 15:51:21.604594 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e165058e-74f7-4f25-9bf6-27f78528246d-kube-api-access-kcdpp" (OuterVolumeSpecName: "kube-api-access-kcdpp") pod "e165058e-74f7-4f25-9bf6-27f78528246d" (UID: "e165058e-74f7-4f25-9bf6-27f78528246d"). InnerVolumeSpecName "kube-api-access-kcdpp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:51:21 crc kubenswrapper[4774]: I1003 15:51:21.698165 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcdpp\" (UniqueName: \"kubernetes.io/projected/e165058e-74f7-4f25-9bf6-27f78528246d-kube-api-access-kcdpp\") on node \"crc\" DevicePath \"\"" Oct 03 15:51:21 crc kubenswrapper[4774]: I1003 15:51:21.974201 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6c675fb79f-b54rk_cea60414-e959-4200-b3e5-e532d2136047/manager/0.log" Oct 03 15:51:22 crc kubenswrapper[4774]: I1003 15:51:22.012872 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6c675fb79f-b54rk_cea60414-e959-4200-b3e5-e532d2136047/kube-rbac-proxy/0.log" Oct 03 15:51:22 crc kubenswrapper[4774]: I1003 15:51:22.207615 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79d68d6c85-6gckj_4c3d1495-6568-44c2-9bd7-82256a4b5aab/kube-rbac-proxy/0.log" Oct 03 15:51:22 crc kubenswrapper[4774]: I1003 15:51:22.238840 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79d68d6c85-6gckj_4c3d1495-6568-44c2-9bd7-82256a4b5aab/manager/0.log" Oct 03 15:51:22 crc kubenswrapper[4774]: I1003 15:51:22.346913 4774 scope.go:117] "RemoveContainer" containerID="356974452cbf2a99938e87e1e11d300153ef50685d743910a5af56a3a1e1f73b" Oct 03 15:51:22 crc kubenswrapper[4774]: I1003 15:51:22.346942 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2968w/crc-debug-2847t" Oct 03 15:51:22 crc kubenswrapper[4774]: I1003 15:51:22.486950 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-m2wmk_4cf018fb-edab-4e23-ad04-763ee25e1613/manager/0.log" Oct 03 15:51:22 crc kubenswrapper[4774]: I1003 15:51:22.509831 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh_128ae2e6-72c5-44ef-bf0a-6f54d80796cd/util/0.log" Oct 03 15:51:22 crc kubenswrapper[4774]: I1003 15:51:22.518194 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-m2wmk_4cf018fb-edab-4e23-ad04-763ee25e1613/kube-rbac-proxy/0.log" Oct 03 15:51:22 crc kubenswrapper[4774]: I1003 15:51:22.750017 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh_128ae2e6-72c5-44ef-bf0a-6f54d80796cd/pull/0.log" Oct 03 15:51:22 crc kubenswrapper[4774]: I1003 15:51:22.781447 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh_128ae2e6-72c5-44ef-bf0a-6f54d80796cd/util/0.log" Oct 03 15:51:22 crc kubenswrapper[4774]: I1003 15:51:22.810163 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh_128ae2e6-72c5-44ef-bf0a-6f54d80796cd/pull/0.log" Oct 03 15:51:22 crc kubenswrapper[4774]: I1003 15:51:22.977025 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh_128ae2e6-72c5-44ef-bf0a-6f54d80796cd/util/0.log" Oct 03 15:51:23 crc kubenswrapper[4774]: I1003 15:51:23.003415 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh_128ae2e6-72c5-44ef-bf0a-6f54d80796cd/pull/0.log" Oct 03 15:51:23 crc kubenswrapper[4774]: I1003 15:51:23.024648 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ffa7a32710a5aeff5132a8f99a05f958970cc49636113bcf9efa984382smfsh_128ae2e6-72c5-44ef-bf0a-6f54d80796cd/extract/0.log" Oct 03 15:51:23 crc kubenswrapper[4774]: I1003 15:51:23.166075 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-846dff85b5-v5kkx_29905a36-139e-4611-bc8e-0289dd1fa0b4/kube-rbac-proxy/0.log" Oct 03 15:51:23 crc kubenswrapper[4774]: I1003 15:51:23.314918 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e165058e-74f7-4f25-9bf6-27f78528246d" path="/var/lib/kubelet/pods/e165058e-74f7-4f25-9bf6-27f78528246d/volumes" Oct 03 15:51:23 crc kubenswrapper[4774]: I1003 15:51:23.317909 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-599898f689-mhz7w_ecf9de8d-83d4-41cb-9b00-e3aeedfb93fd/kube-rbac-proxy/0.log" Oct 03 15:51:23 crc kubenswrapper[4774]: I1003 15:51:23.326772 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-846dff85b5-v5kkx_29905a36-139e-4611-bc8e-0289dd1fa0b4/manager/0.log" Oct 03 15:51:23 crc kubenswrapper[4774]: I1003 15:51:23.379293 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-599898f689-mhz7w_ecf9de8d-83d4-41cb-9b00-e3aeedfb93fd/manager/0.log" Oct 03 15:51:23 crc kubenswrapper[4774]: I1003 15:51:23.515217 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6769b867d9-2tqv8_02bdd1b6-4d8f-40ce-b0fe-449c738d5d0e/kube-rbac-proxy/0.log" Oct 03 15:51:23 crc kubenswrapper[4774]: I1003 
15:51:23.550900 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6769b867d9-2tqv8_02bdd1b6-4d8f-40ce-b0fe-449c738d5d0e/manager/0.log" Oct 03 15:51:23 crc kubenswrapper[4774]: I1003 15:51:23.720976 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5fbf469cd7-zhmr8_bc3a311f-a6c2-40e4-aaae-549aa2395c57/kube-rbac-proxy/0.log" Oct 03 15:51:23 crc kubenswrapper[4774]: I1003 15:51:23.794622 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-84bc9db6cc-2sg6f_741c17eb-da65-4dce-abc6-7faa47d28004/kube-rbac-proxy/0.log" Oct 03 15:51:23 crc kubenswrapper[4774]: I1003 15:51:23.845215 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5fbf469cd7-zhmr8_bc3a311f-a6c2-40e4-aaae-549aa2395c57/manager/0.log" Oct 03 15:51:23 crc kubenswrapper[4774]: I1003 15:51:23.929287 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-84bc9db6cc-2sg6f_741c17eb-da65-4dce-abc6-7faa47d28004/manager/0.log" Oct 03 15:51:24 crc kubenswrapper[4774]: I1003 15:51:24.003223 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7f55849f88-fw4zb_b32ca090-1129-4c77-a2b3-df9e51a35a48/kube-rbac-proxy/0.log" Oct 03 15:51:24 crc kubenswrapper[4774]: I1003 15:51:24.095041 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7f55849f88-fw4zb_b32ca090-1129-4c77-a2b3-df9e51a35a48/manager/0.log" Oct 03 15:51:24 crc kubenswrapper[4774]: I1003 15:51:24.329275 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6fd6854b49-24bvc_10674e8a-5afd-45f7-af36-e9dbfaf2dba0/kube-rbac-proxy/0.log" Oct 
03 15:51:24 crc kubenswrapper[4774]: I1003 15:51:24.353187 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6fd6854b49-24bvc_10674e8a-5afd-45f7-af36-e9dbfaf2dba0/manager/0.log" Oct 03 15:51:24 crc kubenswrapper[4774]: I1003 15:51:24.831701 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5c468bf4d4-mtjh7_6edea7f2-581f-4f41-bdda-45e83dce680d/kube-rbac-proxy/0.log" Oct 03 15:51:24 crc kubenswrapper[4774]: I1003 15:51:24.858106 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5c468bf4d4-mtjh7_6edea7f2-581f-4f41-bdda-45e83dce680d/manager/0.log" Oct 03 15:51:24 crc kubenswrapper[4774]: I1003 15:51:24.867788 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6574bf987d-nw7mh_eaf48bda-c7ca-484b-8d8f-b195d011e8f9/kube-rbac-proxy/0.log" Oct 03 15:51:25 crc kubenswrapper[4774]: I1003 15:51:25.055684 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6574bf987d-nw7mh_eaf48bda-c7ca-484b-8d8f-b195d011e8f9/manager/0.log" Oct 03 15:51:25 crc kubenswrapper[4774]: I1003 15:51:25.105650 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-555c7456bd-fhd72_74855628-79e3-4300-8a8b-d05aeed1904b/kube-rbac-proxy/0.log" Oct 03 15:51:25 crc kubenswrapper[4774]: I1003 15:51:25.207184 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-555c7456bd-fhd72_74855628-79e3-4300-8a8b-d05aeed1904b/manager/0.log" Oct 03 15:51:25 crc kubenswrapper[4774]: I1003 15:51:25.320324 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-59d6cfdf45-84vwj_5c00d52a-acc5-4650-8b36-48faa90030a3/kube-rbac-proxy/0.log" Oct 03 15:51:25 crc kubenswrapper[4774]: I1003 15:51:25.372367 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-59d6cfdf45-84vwj_5c00d52a-acc5-4650-8b36-48faa90030a3/manager/0.log" Oct 03 15:51:25 crc kubenswrapper[4774]: I1003 15:51:25.532727 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6f64c4d678chmhg_2b82b2ba-da6d-4441-a194-4b47207b159a/manager/0.log" Oct 03 15:51:25 crc kubenswrapper[4774]: I1003 15:51:25.536453 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6f64c4d678chmhg_2b82b2ba-da6d-4441-a194-4b47207b159a/kube-rbac-proxy/0.log" Oct 03 15:51:25 crc kubenswrapper[4774]: I1003 15:51:25.649338 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6977957f88-8kmrq_73dd9462-cd4d-40d8-a416-c8ed1ef328fb/kube-rbac-proxy/0.log" Oct 03 15:51:25 crc kubenswrapper[4774]: I1003 15:51:25.750299 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7c89b76849-9klgg_4ad90ede-158d-4798-a2d5-399d61654604/kube-rbac-proxy/0.log" Oct 03 15:51:25 crc kubenswrapper[4774]: I1003 15:51:25.944913 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-2t96w_a97efab2-9188-4828-a600-d346b724f1f9/registry-server/0.log" Oct 03 15:51:25 crc kubenswrapper[4774]: I1003 15:51:25.966921 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7c89b76849-9klgg_4ad90ede-158d-4798-a2d5-399d61654604/operator/0.log" Oct 03 15:51:26 crc kubenswrapper[4774]: 
I1003 15:51:26.171597 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-688db7b6c7-c7hxr_14fc26c3-ab56-44a5-832c-55eaca43cc5c/kube-rbac-proxy/0.log" Oct 03 15:51:26 crc kubenswrapper[4774]: I1003 15:51:26.200191 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-7d8bb7f44c-pqrxt_35aefd42-2274-451a-8526-fb99c1f72be0/kube-rbac-proxy/0.log" Oct 03 15:51:26 crc kubenswrapper[4774]: I1003 15:51:26.225619 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-688db7b6c7-c7hxr_14fc26c3-ab56-44a5-832c-55eaca43cc5c/manager/0.log" Oct 03 15:51:26 crc kubenswrapper[4774]: I1003 15:51:26.759179 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6977957f88-8kmrq_73dd9462-cd4d-40d8-a416-c8ed1ef328fb/manager/0.log" Oct 03 15:51:26 crc kubenswrapper[4774]: I1003 15:51:26.880261 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-xqmql_77b9687d-958c-47ad-835e-160fc6214d72/operator/0.log" Oct 03 15:51:26 crc kubenswrapper[4774]: I1003 15:51:26.880661 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-7d8bb7f44c-pqrxt_35aefd42-2274-451a-8526-fb99c1f72be0/manager/0.log" Oct 03 15:51:26 crc kubenswrapper[4774]: I1003 15:51:26.982119 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-j5d2d_e9d1e188-b3ff-4807-a57e-9bf290e10f22/kube-rbac-proxy/0.log" Oct 03 15:51:27 crc kubenswrapper[4774]: I1003 15:51:27.068217 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-j5d2d_e9d1e188-b3ff-4807-a57e-9bf290e10f22/manager/0.log" Oct 03 
15:51:27 crc kubenswrapper[4774]: I1003 15:51:27.136233 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5db5cf686f-zpq8n_fcb0af6a-547c-4555-86f9-f0b390ae7ce3/kube-rbac-proxy/0.log" Oct 03 15:51:27 crc kubenswrapper[4774]: I1003 15:51:27.194720 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5db5cf686f-zpq8n_fcb0af6a-547c-4555-86f9-f0b390ae7ce3/manager/0.log" Oct 03 15:51:27 crc kubenswrapper[4774]: I1003 15:51:27.271044 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-ftzfn_6f1973d7-94ab-4855-bfd4-91f1e677306f/kube-rbac-proxy/0.log" Oct 03 15:51:27 crc kubenswrapper[4774]: I1003 15:51:27.324317 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-ftzfn_6f1973d7-94ab-4855-bfd4-91f1e677306f/manager/0.log" Oct 03 15:51:27 crc kubenswrapper[4774]: I1003 15:51:27.383731 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-fcd7d9895-ngbtk_b8324f27-b72f-4ad9-adcb-82469098520a/kube-rbac-proxy/0.log" Oct 03 15:51:27 crc kubenswrapper[4774]: I1003 15:51:27.387417 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-fcd7d9895-ngbtk_b8324f27-b72f-4ad9-adcb-82469098520a/manager/0.log" Oct 03 15:51:27 crc kubenswrapper[4774]: I1003 15:51:27.626117 4774 scope.go:117] "RemoveContainer" containerID="46fdfa35a21fd9aa7830c4e5f5abd673fa78a8fb0992e9575101e6911ad019a8" Oct 03 15:51:44 crc kubenswrapper[4774]: I1003 15:51:44.403040 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-7qxcv_98cf7a02-ed29-4cd4-9f60-77659f186e4b/control-plane-machine-set-operator/0.log" Oct 03 15:51:44 
crc kubenswrapper[4774]: I1003 15:51:44.557242 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7nrwv_b5e417ed-5b5e-405b-8b95-ed27ddaef9ee/kube-rbac-proxy/0.log" Oct 03 15:51:44 crc kubenswrapper[4774]: I1003 15:51:44.595607 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7nrwv_b5e417ed-5b5e-405b-8b95-ed27ddaef9ee/machine-api-operator/0.log" Oct 03 15:51:57 crc kubenswrapper[4774]: I1003 15:51:57.000321 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-h2fts_769d7391-5628-4bcd-af8d-accf8b37c400/cert-manager-controller/0.log" Oct 03 15:51:57 crc kubenswrapper[4774]: I1003 15:51:57.192839 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-gghpw_1f99341f-994f-496c-9287-f0fa80429b74/cert-manager-cainjector/0.log" Oct 03 15:51:57 crc kubenswrapper[4774]: I1003 15:51:57.214567 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-hggn6_3d3ba37a-f1af-431c-a733-19283fc5c055/cert-manager-webhook/0.log" Oct 03 15:52:10 crc kubenswrapper[4774]: I1003 15:52:10.313478 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-p8hlt_e1390648-f53e-4c1c-a801-2c76be9fa959/nmstate-console-plugin/0.log" Oct 03 15:52:10 crc kubenswrapper[4774]: I1003 15:52:10.472166 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-bc8h5_6aaa4a80-d658-4f5a-8b6c-cc84ad781c64/nmstate-handler/0.log" Oct 03 15:52:10 crc kubenswrapper[4774]: I1003 15:52:10.486532 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-4lb7n_bd99f12e-3622-42d7-bece-2149da359b49/kube-rbac-proxy/0.log" Oct 03 15:52:10 crc kubenswrapper[4774]: I1003 15:52:10.517465 4774 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-4lb7n_bd99f12e-3622-42d7-bece-2149da359b49/nmstate-metrics/0.log" Oct 03 15:52:10 crc kubenswrapper[4774]: I1003 15:52:10.669613 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-59jwx_a0e3f6b9-0e1a-4627-aad6-b6252f7f4ab9/nmstate-operator/0.log" Oct 03 15:52:10 crc kubenswrapper[4774]: I1003 15:52:10.721100 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-hrzkg_e582dd92-97d1-48bf-a81b-ad144d0a89cf/nmstate-webhook/0.log" Oct 03 15:52:19 crc kubenswrapper[4774]: I1003 15:52:19.007395 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pvshz"] Oct 03 15:52:19 crc kubenswrapper[4774]: E1003 15:52:19.008456 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e165058e-74f7-4f25-9bf6-27f78528246d" containerName="container-00" Oct 03 15:52:19 crc kubenswrapper[4774]: I1003 15:52:19.008473 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e165058e-74f7-4f25-9bf6-27f78528246d" containerName="container-00" Oct 03 15:52:19 crc kubenswrapper[4774]: I1003 15:52:19.008698 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="e165058e-74f7-4f25-9bf6-27f78528246d" containerName="container-00" Oct 03 15:52:19 crc kubenswrapper[4774]: I1003 15:52:19.010300 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pvshz" Oct 03 15:52:19 crc kubenswrapper[4774]: I1003 15:52:19.024235 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvshz"] Oct 03 15:52:19 crc kubenswrapper[4774]: I1003 15:52:19.191527 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd90a524-88f5-4a73-bef4-48ad019e134c-utilities\") pod \"redhat-marketplace-pvshz\" (UID: \"bd90a524-88f5-4a73-bef4-48ad019e134c\") " pod="openshift-marketplace/redhat-marketplace-pvshz" Oct 03 15:52:19 crc kubenswrapper[4774]: I1003 15:52:19.191642 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4k6m\" (UniqueName: \"kubernetes.io/projected/bd90a524-88f5-4a73-bef4-48ad019e134c-kube-api-access-l4k6m\") pod \"redhat-marketplace-pvshz\" (UID: \"bd90a524-88f5-4a73-bef4-48ad019e134c\") " pod="openshift-marketplace/redhat-marketplace-pvshz" Oct 03 15:52:19 crc kubenswrapper[4774]: I1003 15:52:19.191695 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd90a524-88f5-4a73-bef4-48ad019e134c-catalog-content\") pod \"redhat-marketplace-pvshz\" (UID: \"bd90a524-88f5-4a73-bef4-48ad019e134c\") " pod="openshift-marketplace/redhat-marketplace-pvshz" Oct 03 15:52:19 crc kubenswrapper[4774]: I1003 15:52:19.293674 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4k6m\" (UniqueName: \"kubernetes.io/projected/bd90a524-88f5-4a73-bef4-48ad019e134c-kube-api-access-l4k6m\") pod \"redhat-marketplace-pvshz\" (UID: \"bd90a524-88f5-4a73-bef4-48ad019e134c\") " pod="openshift-marketplace/redhat-marketplace-pvshz" Oct 03 15:52:19 crc kubenswrapper[4774]: I1003 15:52:19.293744 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd90a524-88f5-4a73-bef4-48ad019e134c-catalog-content\") pod \"redhat-marketplace-pvshz\" (UID: \"bd90a524-88f5-4a73-bef4-48ad019e134c\") " pod="openshift-marketplace/redhat-marketplace-pvshz" Oct 03 15:52:19 crc kubenswrapper[4774]: I1003 15:52:19.293932 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd90a524-88f5-4a73-bef4-48ad019e134c-utilities\") pod \"redhat-marketplace-pvshz\" (UID: \"bd90a524-88f5-4a73-bef4-48ad019e134c\") " pod="openshift-marketplace/redhat-marketplace-pvshz" Oct 03 15:52:19 crc kubenswrapper[4774]: I1003 15:52:19.294577 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd90a524-88f5-4a73-bef4-48ad019e134c-catalog-content\") pod \"redhat-marketplace-pvshz\" (UID: \"bd90a524-88f5-4a73-bef4-48ad019e134c\") " pod="openshift-marketplace/redhat-marketplace-pvshz" Oct 03 15:52:19 crc kubenswrapper[4774]: I1003 15:52:19.294588 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd90a524-88f5-4a73-bef4-48ad019e134c-utilities\") pod \"redhat-marketplace-pvshz\" (UID: \"bd90a524-88f5-4a73-bef4-48ad019e134c\") " pod="openshift-marketplace/redhat-marketplace-pvshz" Oct 03 15:52:19 crc kubenswrapper[4774]: I1003 15:52:19.324127 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4k6m\" (UniqueName: \"kubernetes.io/projected/bd90a524-88f5-4a73-bef4-48ad019e134c-kube-api-access-l4k6m\") pod \"redhat-marketplace-pvshz\" (UID: \"bd90a524-88f5-4a73-bef4-48ad019e134c\") " pod="openshift-marketplace/redhat-marketplace-pvshz" Oct 03 15:52:19 crc kubenswrapper[4774]: I1003 15:52:19.348479 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pvshz" Oct 03 15:52:19 crc kubenswrapper[4774]: I1003 15:52:19.782501 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvshz"] Oct 03 15:52:19 crc kubenswrapper[4774]: I1003 15:52:19.881239 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvshz" event={"ID":"bd90a524-88f5-4a73-bef4-48ad019e134c","Type":"ContainerStarted","Data":"e393a39a39ff86827e6f3181974026d2623bb1eb57afc3a33d273f87d8994c68"} Oct 03 15:52:20 crc kubenswrapper[4774]: I1003 15:52:20.916136 4774 generic.go:334] "Generic (PLEG): container finished" podID="bd90a524-88f5-4a73-bef4-48ad019e134c" containerID="536fe57b48f3ba5f9bb63072721410133e5c02ed0c5083085ed2ebcae4a3413e" exitCode=0 Oct 03 15:52:20 crc kubenswrapper[4774]: I1003 15:52:20.916235 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvshz" event={"ID":"bd90a524-88f5-4a73-bef4-48ad019e134c","Type":"ContainerDied","Data":"536fe57b48f3ba5f9bb63072721410133e5c02ed0c5083085ed2ebcae4a3413e"} Oct 03 15:52:22 crc kubenswrapper[4774]: I1003 15:52:22.936056 4774 generic.go:334] "Generic (PLEG): container finished" podID="bd90a524-88f5-4a73-bef4-48ad019e134c" containerID="7bb6b0a0b6ac08ba371e40142f1e6035db7c045761bdfc265371d8a23128dbe7" exitCode=0 Oct 03 15:52:22 crc kubenswrapper[4774]: I1003 15:52:22.936127 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvshz" event={"ID":"bd90a524-88f5-4a73-bef4-48ad019e134c","Type":"ContainerDied","Data":"7bb6b0a0b6ac08ba371e40142f1e6035db7c045761bdfc265371d8a23128dbe7"} Oct 03 15:52:24 crc kubenswrapper[4774]: I1003 15:52:24.954165 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvshz" 
event={"ID":"bd90a524-88f5-4a73-bef4-48ad019e134c","Type":"ContainerStarted","Data":"6d20412534050e783c36587e54ab4d591e70c916764eede262e9aabaf6c2c4d4"} Oct 03 15:52:24 crc kubenswrapper[4774]: I1003 15:52:24.988968 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pvshz" podStartSLOduration=3.369858736 podStartE2EDuration="6.988949646s" podCreationTimestamp="2025-10-03 15:52:18 +0000 UTC" firstStartedPulling="2025-10-03 15:52:20.918811869 +0000 UTC m=+4163.508015321" lastFinishedPulling="2025-10-03 15:52:24.537902779 +0000 UTC m=+4167.127106231" observedRunningTime="2025-10-03 15:52:24.980796234 +0000 UTC m=+4167.569999686" watchObservedRunningTime="2025-10-03 15:52:24.988949646 +0000 UTC m=+4167.578153098" Oct 03 15:52:25 crc kubenswrapper[4774]: I1003 15:52:25.071478 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-qp2nq_625d3121-98b6-42e6-bc58-ea4bbdc5a7ad/kube-rbac-proxy/0.log" Oct 03 15:52:25 crc kubenswrapper[4774]: I1003 15:52:25.078917 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-qp2nq_625d3121-98b6-42e6-bc58-ea4bbdc5a7ad/controller/0.log" Oct 03 15:52:25 crc kubenswrapper[4774]: I1003 15:52:25.283526 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/cp-frr-files/0.log" Oct 03 15:52:25 crc kubenswrapper[4774]: I1003 15:52:25.554570 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/cp-frr-files/0.log" Oct 03 15:52:25 crc kubenswrapper[4774]: I1003 15:52:25.576035 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/cp-reloader/0.log" Oct 03 15:52:25 crc kubenswrapper[4774]: I1003 15:52:25.600440 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/cp-metrics/0.log" Oct 03 15:52:25 crc kubenswrapper[4774]: I1003 15:52:25.628526 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/cp-reloader/0.log" Oct 03 15:52:25 crc kubenswrapper[4774]: I1003 15:52:25.810116 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/cp-metrics/0.log" Oct 03 15:52:25 crc kubenswrapper[4774]: I1003 15:52:25.810698 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/cp-reloader/0.log" Oct 03 15:52:25 crc kubenswrapper[4774]: I1003 15:52:25.821756 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/cp-frr-files/0.log" Oct 03 15:52:25 crc kubenswrapper[4774]: I1003 15:52:25.853640 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/cp-metrics/0.log" Oct 03 15:52:25 crc kubenswrapper[4774]: I1003 15:52:25.992622 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/cp-metrics/0.log" Oct 03 15:52:26 crc kubenswrapper[4774]: I1003 15:52:26.025131 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/cp-frr-files/0.log" Oct 03 15:52:26 crc kubenswrapper[4774]: I1003 15:52:26.048837 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/cp-reloader/0.log" Oct 03 15:52:26 crc kubenswrapper[4774]: I1003 15:52:26.109261 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/controller/0.log" Oct 03 15:52:26 crc kubenswrapper[4774]: I1003 15:52:26.255760 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/frr-metrics/0.log" Oct 03 15:52:26 crc kubenswrapper[4774]: I1003 15:52:26.334188 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/kube-rbac-proxy/0.log" Oct 03 15:52:26 crc kubenswrapper[4774]: I1003 15:52:26.374773 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/kube-rbac-proxy-frr/0.log" Oct 03 15:52:26 crc kubenswrapper[4774]: I1003 15:52:26.463216 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/reloader/0.log" Oct 03 15:52:26 crc kubenswrapper[4774]: I1003 15:52:26.585546 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-dtqfw_9c555a1f-be66-4efe-81ed-d2d90bd5e2f7/frr-k8s-webhook-server/0.log" Oct 03 15:52:26 crc kubenswrapper[4774]: I1003 15:52:26.852151 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7f6cc8bc96-sdmhf_87f70971-8510-4214-86dc-011aaf626b7a/manager/0.log" Oct 03 15:52:26 crc kubenswrapper[4774]: I1003 15:52:26.916835 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-696d9855d4-wkrd4_637291ee-7c78-4978-a59f-da3c5d284724/webhook-server/0.log" Oct 03 15:52:27 crc kubenswrapper[4774]: I1003 15:52:27.155084 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-xfjwm_e5a0c71d-7887-4a39-b427-221389fecc1e/kube-rbac-proxy/0.log" Oct 03 15:52:27 crc kubenswrapper[4774]: I1003 15:52:27.587753 4774 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vt9fs_cf48682e-2440-425a-bbd5-ebc1597e265d/frr/0.log" Oct 03 15:52:27 crc kubenswrapper[4774]: I1003 15:52:27.688604 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-xfjwm_e5a0c71d-7887-4a39-b427-221389fecc1e/speaker/0.log" Oct 03 15:52:29 crc kubenswrapper[4774]: I1003 15:52:29.348717 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pvshz" Oct 03 15:52:29 crc kubenswrapper[4774]: I1003 15:52:29.350010 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pvshz" Oct 03 15:52:29 crc kubenswrapper[4774]: I1003 15:52:29.398341 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pvshz" Oct 03 15:52:30 crc kubenswrapper[4774]: I1003 15:52:30.044957 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pvshz" Oct 03 15:52:30 crc kubenswrapper[4774]: I1003 15:52:30.103359 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvshz"] Oct 03 15:52:32 crc kubenswrapper[4774]: I1003 15:52:32.009483 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pvshz" podUID="bd90a524-88f5-4a73-bef4-48ad019e134c" containerName="registry-server" containerID="cri-o://6d20412534050e783c36587e54ab4d591e70c916764eede262e9aabaf6c2c4d4" gracePeriod=2 Oct 03 15:52:32 crc kubenswrapper[4774]: I1003 15:52:32.487610 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pvshz" Oct 03 15:52:32 crc kubenswrapper[4774]: I1003 15:52:32.548030 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4k6m\" (UniqueName: \"kubernetes.io/projected/bd90a524-88f5-4a73-bef4-48ad019e134c-kube-api-access-l4k6m\") pod \"bd90a524-88f5-4a73-bef4-48ad019e134c\" (UID: \"bd90a524-88f5-4a73-bef4-48ad019e134c\") " Oct 03 15:52:32 crc kubenswrapper[4774]: I1003 15:52:32.548156 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd90a524-88f5-4a73-bef4-48ad019e134c-catalog-content\") pod \"bd90a524-88f5-4a73-bef4-48ad019e134c\" (UID: \"bd90a524-88f5-4a73-bef4-48ad019e134c\") " Oct 03 15:52:32 crc kubenswrapper[4774]: I1003 15:52:32.548192 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd90a524-88f5-4a73-bef4-48ad019e134c-utilities\") pod \"bd90a524-88f5-4a73-bef4-48ad019e134c\" (UID: \"bd90a524-88f5-4a73-bef4-48ad019e134c\") " Oct 03 15:52:32 crc kubenswrapper[4774]: I1003 15:52:32.549253 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd90a524-88f5-4a73-bef4-48ad019e134c-utilities" (OuterVolumeSpecName: "utilities") pod "bd90a524-88f5-4a73-bef4-48ad019e134c" (UID: "bd90a524-88f5-4a73-bef4-48ad019e134c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:52:32 crc kubenswrapper[4774]: I1003 15:52:32.554746 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd90a524-88f5-4a73-bef4-48ad019e134c-kube-api-access-l4k6m" (OuterVolumeSpecName: "kube-api-access-l4k6m") pod "bd90a524-88f5-4a73-bef4-48ad019e134c" (UID: "bd90a524-88f5-4a73-bef4-48ad019e134c"). InnerVolumeSpecName "kube-api-access-l4k6m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:52:32 crc kubenswrapper[4774]: I1003 15:52:32.569232 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd90a524-88f5-4a73-bef4-48ad019e134c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bd90a524-88f5-4a73-bef4-48ad019e134c" (UID: "bd90a524-88f5-4a73-bef4-48ad019e134c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:52:32 crc kubenswrapper[4774]: I1003 15:52:32.650411 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4k6m\" (UniqueName: \"kubernetes.io/projected/bd90a524-88f5-4a73-bef4-48ad019e134c-kube-api-access-l4k6m\") on node \"crc\" DevicePath \"\"" Oct 03 15:52:32 crc kubenswrapper[4774]: I1003 15:52:32.650444 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd90a524-88f5-4a73-bef4-48ad019e134c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 15:52:32 crc kubenswrapper[4774]: I1003 15:52:32.650454 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd90a524-88f5-4a73-bef4-48ad019e134c-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 15:52:33 crc kubenswrapper[4774]: I1003 15:52:33.030926 4774 generic.go:334] "Generic (PLEG): container finished" podID="bd90a524-88f5-4a73-bef4-48ad019e134c" containerID="6d20412534050e783c36587e54ab4d591e70c916764eede262e9aabaf6c2c4d4" exitCode=0 Oct 03 15:52:33 crc kubenswrapper[4774]: I1003 15:52:33.031005 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvshz" event={"ID":"bd90a524-88f5-4a73-bef4-48ad019e134c","Type":"ContainerDied","Data":"6d20412534050e783c36587e54ab4d591e70c916764eede262e9aabaf6c2c4d4"} Oct 03 15:52:33 crc kubenswrapper[4774]: I1003 15:52:33.032905 4774 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-pvshz" event={"ID":"bd90a524-88f5-4a73-bef4-48ad019e134c","Type":"ContainerDied","Data":"e393a39a39ff86827e6f3181974026d2623bb1eb57afc3a33d273f87d8994c68"} Oct 03 15:52:33 crc kubenswrapper[4774]: I1003 15:52:33.031037 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pvshz" Oct 03 15:52:33 crc kubenswrapper[4774]: I1003 15:52:33.032972 4774 scope.go:117] "RemoveContainer" containerID="6d20412534050e783c36587e54ab4d591e70c916764eede262e9aabaf6c2c4d4" Oct 03 15:52:33 crc kubenswrapper[4774]: I1003 15:52:33.057846 4774 scope.go:117] "RemoveContainer" containerID="7bb6b0a0b6ac08ba371e40142f1e6035db7c045761bdfc265371d8a23128dbe7" Oct 03 15:52:33 crc kubenswrapper[4774]: I1003 15:52:33.101157 4774 scope.go:117] "RemoveContainer" containerID="536fe57b48f3ba5f9bb63072721410133e5c02ed0c5083085ed2ebcae4a3413e" Oct 03 15:52:33 crc kubenswrapper[4774]: I1003 15:52:33.104716 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvshz"] Oct 03 15:52:33 crc kubenswrapper[4774]: I1003 15:52:33.119859 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvshz"] Oct 03 15:52:33 crc kubenswrapper[4774]: I1003 15:52:33.134966 4774 scope.go:117] "RemoveContainer" containerID="6d20412534050e783c36587e54ab4d591e70c916764eede262e9aabaf6c2c4d4" Oct 03 15:52:33 crc kubenswrapper[4774]: E1003 15:52:33.135594 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d20412534050e783c36587e54ab4d591e70c916764eede262e9aabaf6c2c4d4\": container with ID starting with 6d20412534050e783c36587e54ab4d591e70c916764eede262e9aabaf6c2c4d4 not found: ID does not exist" containerID="6d20412534050e783c36587e54ab4d591e70c916764eede262e9aabaf6c2c4d4" Oct 03 15:52:33 crc kubenswrapper[4774]: I1003 15:52:33.135625 4774 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d20412534050e783c36587e54ab4d591e70c916764eede262e9aabaf6c2c4d4"} err="failed to get container status \"6d20412534050e783c36587e54ab4d591e70c916764eede262e9aabaf6c2c4d4\": rpc error: code = NotFound desc = could not find container \"6d20412534050e783c36587e54ab4d591e70c916764eede262e9aabaf6c2c4d4\": container with ID starting with 6d20412534050e783c36587e54ab4d591e70c916764eede262e9aabaf6c2c4d4 not found: ID does not exist" Oct 03 15:52:33 crc kubenswrapper[4774]: I1003 15:52:33.135648 4774 scope.go:117] "RemoveContainer" containerID="7bb6b0a0b6ac08ba371e40142f1e6035db7c045761bdfc265371d8a23128dbe7" Oct 03 15:52:33 crc kubenswrapper[4774]: E1003 15:52:33.136486 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bb6b0a0b6ac08ba371e40142f1e6035db7c045761bdfc265371d8a23128dbe7\": container with ID starting with 7bb6b0a0b6ac08ba371e40142f1e6035db7c045761bdfc265371d8a23128dbe7 not found: ID does not exist" containerID="7bb6b0a0b6ac08ba371e40142f1e6035db7c045761bdfc265371d8a23128dbe7" Oct 03 15:52:33 crc kubenswrapper[4774]: I1003 15:52:33.136525 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bb6b0a0b6ac08ba371e40142f1e6035db7c045761bdfc265371d8a23128dbe7"} err="failed to get container status \"7bb6b0a0b6ac08ba371e40142f1e6035db7c045761bdfc265371d8a23128dbe7\": rpc error: code = NotFound desc = could not find container \"7bb6b0a0b6ac08ba371e40142f1e6035db7c045761bdfc265371d8a23128dbe7\": container with ID starting with 7bb6b0a0b6ac08ba371e40142f1e6035db7c045761bdfc265371d8a23128dbe7 not found: ID does not exist" Oct 03 15:52:33 crc kubenswrapper[4774]: I1003 15:52:33.136539 4774 scope.go:117] "RemoveContainer" containerID="536fe57b48f3ba5f9bb63072721410133e5c02ed0c5083085ed2ebcae4a3413e" Oct 03 15:52:33 crc kubenswrapper[4774]: E1003 
15:52:33.136867 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"536fe57b48f3ba5f9bb63072721410133e5c02ed0c5083085ed2ebcae4a3413e\": container with ID starting with 536fe57b48f3ba5f9bb63072721410133e5c02ed0c5083085ed2ebcae4a3413e not found: ID does not exist" containerID="536fe57b48f3ba5f9bb63072721410133e5c02ed0c5083085ed2ebcae4a3413e" Oct 03 15:52:33 crc kubenswrapper[4774]: I1003 15:52:33.136881 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"536fe57b48f3ba5f9bb63072721410133e5c02ed0c5083085ed2ebcae4a3413e"} err="failed to get container status \"536fe57b48f3ba5f9bb63072721410133e5c02ed0c5083085ed2ebcae4a3413e\": rpc error: code = NotFound desc = could not find container \"536fe57b48f3ba5f9bb63072721410133e5c02ed0c5083085ed2ebcae4a3413e\": container with ID starting with 536fe57b48f3ba5f9bb63072721410133e5c02ed0c5083085ed2ebcae4a3413e not found: ID does not exist" Oct 03 15:52:33 crc kubenswrapper[4774]: I1003 15:52:33.321536 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd90a524-88f5-4a73-bef4-48ad019e134c" path="/var/lib/kubelet/pods/bd90a524-88f5-4a73-bef4-48ad019e134c/volumes" Oct 03 15:52:41 crc kubenswrapper[4774]: I1003 15:52:41.080310 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m_53c45343-23b3-4606-a5e1-bdd4c43b2752/util/0.log" Oct 03 15:52:41 crc kubenswrapper[4774]: I1003 15:52:41.612123 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m_53c45343-23b3-4606-a5e1-bdd4c43b2752/util/0.log" Oct 03 15:52:41 crc kubenswrapper[4774]: I1003 15:52:41.668834 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m_53c45343-23b3-4606-a5e1-bdd4c43b2752/pull/0.log" Oct 03 15:52:41 crc kubenswrapper[4774]: I1003 15:52:41.678033 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m_53c45343-23b3-4606-a5e1-bdd4c43b2752/pull/0.log" Oct 03 15:52:41 crc kubenswrapper[4774]: I1003 15:52:41.899236 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m_53c45343-23b3-4606-a5e1-bdd4c43b2752/extract/0.log" Oct 03 15:52:41 crc kubenswrapper[4774]: I1003 15:52:41.906233 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m_53c45343-23b3-4606-a5e1-bdd4c43b2752/pull/0.log" Oct 03 15:52:41 crc kubenswrapper[4774]: I1003 15:52:41.940442 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2nzv5m_53c45343-23b3-4606-a5e1-bdd4c43b2752/util/0.log" Oct 03 15:52:42 crc kubenswrapper[4774]: I1003 15:52:42.095183 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4knpn_a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c/extract-utilities/0.log" Oct 03 15:52:42 crc kubenswrapper[4774]: I1003 15:52:42.232098 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4knpn_a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c/extract-utilities/0.log" Oct 03 15:52:42 crc kubenswrapper[4774]: I1003 15:52:42.241523 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4knpn_a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c/extract-content/0.log" Oct 03 15:52:42 crc kubenswrapper[4774]: I1003 15:52:42.279485 4774 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4knpn_a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c/extract-content/0.log" Oct 03 15:52:42 crc kubenswrapper[4774]: I1003 15:52:42.439864 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4knpn_a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c/extract-content/0.log" Oct 03 15:52:42 crc kubenswrapper[4774]: I1003 15:52:42.467501 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4knpn_a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c/extract-utilities/0.log" Oct 03 15:52:42 crc kubenswrapper[4774]: I1003 15:52:42.729554 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m7x75_8f0d7089-cd30-47db-a3cc-44492151e300/extract-utilities/0.log" Oct 03 15:52:42 crc kubenswrapper[4774]: I1003 15:52:42.884469 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m7x75_8f0d7089-cd30-47db-a3cc-44492151e300/extract-utilities/0.log" Oct 03 15:52:42 crc kubenswrapper[4774]: I1003 15:52:42.905282 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m7x75_8f0d7089-cd30-47db-a3cc-44492151e300/extract-content/0.log" Oct 03 15:52:42 crc kubenswrapper[4774]: I1003 15:52:42.937962 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m7x75_8f0d7089-cd30-47db-a3cc-44492151e300/extract-content/0.log" Oct 03 15:52:42 crc kubenswrapper[4774]: I1003 15:52:42.968277 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4knpn_a6a812a9-1e31-4cdc-ab9c-7f3f7af2977c/registry-server/0.log" Oct 03 15:52:43 crc kubenswrapper[4774]: I1003 15:52:43.118032 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-m7x75_8f0d7089-cd30-47db-a3cc-44492151e300/extract-utilities/0.log" Oct 03 15:52:43 crc kubenswrapper[4774]: I1003 15:52:43.209836 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m7x75_8f0d7089-cd30-47db-a3cc-44492151e300/extract-content/0.log" Oct 03 15:52:43 crc kubenswrapper[4774]: I1003 15:52:43.535581 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm_7845d8dc-4399-4450-b1d6-d424a8d64539/util/0.log" Oct 03 15:52:43 crc kubenswrapper[4774]: I1003 15:52:43.722313 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m7x75_8f0d7089-cd30-47db-a3cc-44492151e300/registry-server/0.log" Oct 03 15:52:43 crc kubenswrapper[4774]: I1003 15:52:43.779937 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm_7845d8dc-4399-4450-b1d6-d424a8d64539/util/0.log" Oct 03 15:52:43 crc kubenswrapper[4774]: I1003 15:52:43.834715 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm_7845d8dc-4399-4450-b1d6-d424a8d64539/pull/0.log" Oct 03 15:52:43 crc kubenswrapper[4774]: I1003 15:52:43.841235 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm_7845d8dc-4399-4450-b1d6-d424a8d64539/pull/0.log" Oct 03 15:52:44 crc kubenswrapper[4774]: I1003 15:52:44.022484 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm_7845d8dc-4399-4450-b1d6-d424a8d64539/util/0.log" Oct 03 15:52:44 crc kubenswrapper[4774]: I1003 15:52:44.032473 4774 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm_7845d8dc-4399-4450-b1d6-d424a8d64539/pull/0.log" Oct 03 15:52:44 crc kubenswrapper[4774]: I1003 15:52:44.069421 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c87dpm_7845d8dc-4399-4450-b1d6-d424a8d64539/extract/0.log" Oct 03 15:52:44 crc kubenswrapper[4774]: I1003 15:52:44.177265 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-p2hrr_3fd11a50-e44d-4d7f-b301-6c7069bf6096/marketplace-operator/0.log" Oct 03 15:52:44 crc kubenswrapper[4774]: I1003 15:52:44.239167 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2gxlv_cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6/extract-utilities/0.log" Oct 03 15:52:44 crc kubenswrapper[4774]: I1003 15:52:44.437121 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2gxlv_cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6/extract-utilities/0.log" Oct 03 15:52:44 crc kubenswrapper[4774]: I1003 15:52:44.445646 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2gxlv_cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6/extract-content/0.log" Oct 03 15:52:44 crc kubenswrapper[4774]: I1003 15:52:44.463005 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2gxlv_cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6/extract-content/0.log" Oct 03 15:52:44 crc kubenswrapper[4774]: I1003 15:52:44.618221 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2gxlv_cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6/extract-content/0.log" Oct 03 15:52:44 crc kubenswrapper[4774]: I1003 15:52:44.648141 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-2gxlv_cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6/extract-utilities/0.log"
Oct 03 15:52:44 crc kubenswrapper[4774]: I1003 15:52:44.778702 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2gxlv_cf28bbfc-dbef-4d85-bf2b-cb550ac0acd6/registry-server/0.log"
Oct 03 15:52:44 crc kubenswrapper[4774]: I1003 15:52:44.830660 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mx9vw_f0c1612a-d998-4683-abf4-433f470c76b1/extract-utilities/0.log"
Oct 03 15:52:44 crc kubenswrapper[4774]: I1003 15:52:44.987838 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mx9vw_f0c1612a-d998-4683-abf4-433f470c76b1/extract-content/0.log"
Oct 03 15:52:45 crc kubenswrapper[4774]: I1003 15:52:45.005907 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mx9vw_f0c1612a-d998-4683-abf4-433f470c76b1/extract-content/0.log"
Oct 03 15:52:45 crc kubenswrapper[4774]: I1003 15:52:45.006367 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mx9vw_f0c1612a-d998-4683-abf4-433f470c76b1/extract-utilities/0.log"
Oct 03 15:52:45 crc kubenswrapper[4774]: I1003 15:52:45.198110 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mx9vw_f0c1612a-d998-4683-abf4-433f470c76b1/extract-utilities/0.log"
Oct 03 15:52:45 crc kubenswrapper[4774]: I1003 15:52:45.198522 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mx9vw_f0c1612a-d998-4683-abf4-433f470c76b1/extract-content/0.log"
Oct 03 15:52:45 crc kubenswrapper[4774]: I1003 15:52:45.609959 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mx9vw_f0c1612a-d998-4683-abf4-433f470c76b1/registry-server/0.log"
Oct 03 15:52:50 crc kubenswrapper[4774]: I1003 15:52:50.654393 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 15:52:50 crc kubenswrapper[4774]: I1003 15:52:50.655039 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 15:53:20 crc kubenswrapper[4774]: I1003 15:53:20.654248 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 15:53:20 crc kubenswrapper[4774]: I1003 15:53:20.654795 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 15:53:50 crc kubenswrapper[4774]: I1003 15:53:50.653975 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 15:53:50 crc kubenswrapper[4774]: I1003 15:53:50.655697 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 15:53:50 crc kubenswrapper[4774]: I1003 15:53:50.655835 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z"
Oct 03 15:53:50 crc kubenswrapper[4774]: I1003 15:53:50.656761 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"977d7962c88000b86f065a324347466b050767bdc139cbee70676501833807d1"} pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 03 15:53:50 crc kubenswrapper[4774]: I1003 15:53:50.656947 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" containerID="cri-o://977d7962c88000b86f065a324347466b050767bdc139cbee70676501833807d1" gracePeriod=600
Oct 03 15:53:50 crc kubenswrapper[4774]: I1003 15:53:50.796578 4774 generic.go:334] "Generic (PLEG): container finished" podID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerID="977d7962c88000b86f065a324347466b050767bdc139cbee70676501833807d1" exitCode=0
Oct 03 15:53:50 crc kubenswrapper[4774]: I1003 15:53:50.796650 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" event={"ID":"ca37ac4b-f421-4198-a179-12901d36f0f5","Type":"ContainerDied","Data":"977d7962c88000b86f065a324347466b050767bdc139cbee70676501833807d1"}
Oct 03 15:53:50 crc kubenswrapper[4774]: I1003 15:53:50.796843 4774 scope.go:117] "RemoveContainer" containerID="6a9aac6dbc322b9090f9a59162378fc7732101d04a69222bfa4d228837707411"
Oct 03 15:53:51 crc kubenswrapper[4774]: I1003 15:53:51.812076 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" event={"ID":"ca37ac4b-f421-4198-a179-12901d36f0f5","Type":"ContainerStarted","Data":"c00c13cbbf5042cec618bfc3fead6cd3ca75c19d1b83dd6e9f5597c0f86daae6"}
Oct 03 15:54:41 crc kubenswrapper[4774]: I1003 15:54:41.351722 4774 generic.go:334] "Generic (PLEG): container finished" podID="d6bade7f-119a-4b23-bbf8-27860297b296" containerID="832d8e112054844bc36903eeb577129ed6bcfbfa04d6e91c7a72367d9b80d29a" exitCode=0
Oct 03 15:54:41 crc kubenswrapper[4774]: I1003 15:54:41.351768 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2968w/must-gather-ftqsx" event={"ID":"d6bade7f-119a-4b23-bbf8-27860297b296","Type":"ContainerDied","Data":"832d8e112054844bc36903eeb577129ed6bcfbfa04d6e91c7a72367d9b80d29a"}
Oct 03 15:54:41 crc kubenswrapper[4774]: I1003 15:54:41.352983 4774 scope.go:117] "RemoveContainer" containerID="832d8e112054844bc36903eeb577129ed6bcfbfa04d6e91c7a72367d9b80d29a"
Oct 03 15:54:41 crc kubenswrapper[4774]: I1003 15:54:41.987822 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2968w_must-gather-ftqsx_d6bade7f-119a-4b23-bbf8-27860297b296/gather/0.log"
Oct 03 15:54:54 crc kubenswrapper[4774]: I1003 15:54:54.366830 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2968w/must-gather-ftqsx"]
Oct 03 15:54:54 crc kubenswrapper[4774]: I1003 15:54:54.367666 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-2968w/must-gather-ftqsx" podUID="d6bade7f-119a-4b23-bbf8-27860297b296" containerName="copy" containerID="cri-o://935050b4acabd37437c5ca63cd170014713d0b0231d78ab6811fef5a1e5692fb" gracePeriod=2
Oct 03 15:54:54 crc kubenswrapper[4774]: I1003 15:54:54.377319 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2968w/must-gather-ftqsx"]
Oct 03 15:54:54 crc kubenswrapper[4774]: I1003 15:54:54.810461 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2968w_must-gather-ftqsx_d6bade7f-119a-4b23-bbf8-27860297b296/copy/0.log"
Oct 03 15:54:54 crc kubenswrapper[4774]: I1003 15:54:54.811204 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2968w/must-gather-ftqsx"
Oct 03 15:54:54 crc kubenswrapper[4774]: I1003 15:54:54.856335 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28hrx\" (UniqueName: \"kubernetes.io/projected/d6bade7f-119a-4b23-bbf8-27860297b296-kube-api-access-28hrx\") pod \"d6bade7f-119a-4b23-bbf8-27860297b296\" (UID: \"d6bade7f-119a-4b23-bbf8-27860297b296\") "
Oct 03 15:54:54 crc kubenswrapper[4774]: I1003 15:54:54.856535 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d6bade7f-119a-4b23-bbf8-27860297b296-must-gather-output\") pod \"d6bade7f-119a-4b23-bbf8-27860297b296\" (UID: \"d6bade7f-119a-4b23-bbf8-27860297b296\") "
Oct 03 15:54:54 crc kubenswrapper[4774]: I1003 15:54:54.863919 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6bade7f-119a-4b23-bbf8-27860297b296-kube-api-access-28hrx" (OuterVolumeSpecName: "kube-api-access-28hrx") pod "d6bade7f-119a-4b23-bbf8-27860297b296" (UID: "d6bade7f-119a-4b23-bbf8-27860297b296"). InnerVolumeSpecName "kube-api-access-28hrx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 15:54:54 crc kubenswrapper[4774]: I1003 15:54:54.958912 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28hrx\" (UniqueName: \"kubernetes.io/projected/d6bade7f-119a-4b23-bbf8-27860297b296-kube-api-access-28hrx\") on node \"crc\" DevicePath \"\""
Oct 03 15:54:55 crc kubenswrapper[4774]: I1003 15:54:55.016696 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6bade7f-119a-4b23-bbf8-27860297b296-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "d6bade7f-119a-4b23-bbf8-27860297b296" (UID: "d6bade7f-119a-4b23-bbf8-27860297b296"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 15:54:55 crc kubenswrapper[4774]: I1003 15:54:55.060650 4774 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d6bade7f-119a-4b23-bbf8-27860297b296-must-gather-output\") on node \"crc\" DevicePath \"\""
Oct 03 15:54:55 crc kubenswrapper[4774]: I1003 15:54:55.310659 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6bade7f-119a-4b23-bbf8-27860297b296" path="/var/lib/kubelet/pods/d6bade7f-119a-4b23-bbf8-27860297b296/volumes"
Oct 03 15:54:55 crc kubenswrapper[4774]: I1003 15:54:55.493986 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2968w_must-gather-ftqsx_d6bade7f-119a-4b23-bbf8-27860297b296/copy/0.log"
Oct 03 15:54:55 crc kubenswrapper[4774]: I1003 15:54:55.494447 4774 generic.go:334] "Generic (PLEG): container finished" podID="d6bade7f-119a-4b23-bbf8-27860297b296" containerID="935050b4acabd37437c5ca63cd170014713d0b0231d78ab6811fef5a1e5692fb" exitCode=143
Oct 03 15:54:55 crc kubenswrapper[4774]: I1003 15:54:55.494512 4774 scope.go:117] "RemoveContainer" containerID="935050b4acabd37437c5ca63cd170014713d0b0231d78ab6811fef5a1e5692fb"
Oct 03 15:54:55 crc kubenswrapper[4774]: I1003 15:54:55.494518 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2968w/must-gather-ftqsx"
Oct 03 15:54:55 crc kubenswrapper[4774]: I1003 15:54:55.515189 4774 scope.go:117] "RemoveContainer" containerID="832d8e112054844bc36903eeb577129ed6bcfbfa04d6e91c7a72367d9b80d29a"
Oct 03 15:54:55 crc kubenswrapper[4774]: I1003 15:54:55.589934 4774 scope.go:117] "RemoveContainer" containerID="935050b4acabd37437c5ca63cd170014713d0b0231d78ab6811fef5a1e5692fb"
Oct 03 15:54:55 crc kubenswrapper[4774]: E1003 15:54:55.590802 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"935050b4acabd37437c5ca63cd170014713d0b0231d78ab6811fef5a1e5692fb\": container with ID starting with 935050b4acabd37437c5ca63cd170014713d0b0231d78ab6811fef5a1e5692fb not found: ID does not exist" containerID="935050b4acabd37437c5ca63cd170014713d0b0231d78ab6811fef5a1e5692fb"
Oct 03 15:54:55 crc kubenswrapper[4774]: I1003 15:54:55.590855 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"935050b4acabd37437c5ca63cd170014713d0b0231d78ab6811fef5a1e5692fb"} err="failed to get container status \"935050b4acabd37437c5ca63cd170014713d0b0231d78ab6811fef5a1e5692fb\": rpc error: code = NotFound desc = could not find container \"935050b4acabd37437c5ca63cd170014713d0b0231d78ab6811fef5a1e5692fb\": container with ID starting with 935050b4acabd37437c5ca63cd170014713d0b0231d78ab6811fef5a1e5692fb not found: ID does not exist"
Oct 03 15:54:55 crc kubenswrapper[4774]: I1003 15:54:55.590888 4774 scope.go:117] "RemoveContainer" containerID="832d8e112054844bc36903eeb577129ed6bcfbfa04d6e91c7a72367d9b80d29a"
Oct 03 15:54:55 crc kubenswrapper[4774]: E1003 15:54:55.591301 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"832d8e112054844bc36903eeb577129ed6bcfbfa04d6e91c7a72367d9b80d29a\": container with ID starting with 832d8e112054844bc36903eeb577129ed6bcfbfa04d6e91c7a72367d9b80d29a not found: ID does not exist" containerID="832d8e112054844bc36903eeb577129ed6bcfbfa04d6e91c7a72367d9b80d29a"
Oct 03 15:54:55 crc kubenswrapper[4774]: I1003 15:54:55.591338 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"832d8e112054844bc36903eeb577129ed6bcfbfa04d6e91c7a72367d9b80d29a"} err="failed to get container status \"832d8e112054844bc36903eeb577129ed6bcfbfa04d6e91c7a72367d9b80d29a\": rpc error: code = NotFound desc = could not find container \"832d8e112054844bc36903eeb577129ed6bcfbfa04d6e91c7a72367d9b80d29a\": container with ID starting with 832d8e112054844bc36903eeb577129ed6bcfbfa04d6e91c7a72367d9b80d29a not found: ID does not exist"
Oct 03 15:55:27 crc kubenswrapper[4774]: I1003 15:55:27.815655 4774 scope.go:117] "RemoveContainer" containerID="0f125f54ee63c120a173107e0f934ef6dfaec0b7ef3cb99aceb30884c4e5a1ef"
Oct 03 15:56:20 crc kubenswrapper[4774]: I1003 15:56:20.653764 4774 patch_prober.go:28] interesting pod/machine-config-daemon-s6v5z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 15:56:20 crc kubenswrapper[4774]: I1003 15:56:20.654685 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6v5z" podUID="ca37ac4b-f421-4198-a179-12901d36f0f5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"